I remember this personally!
I was always stumped as to what the hell was happening.
> These special surfaces were called “overlays” because they appeared to overlay the desktop.
I have some vague memory of programs whose windows had funky shapes (i.e. not rectangular) also using overlays of some kind. Maybe that's a different sort of overlay?
This brings back memories of my old HP laptop with an Athlon 64 and a Radeon X200M. The crappy FGLRX driver only supported overlays (afair) and so when running something like Compiz it would transform the window with the green background but the video itself would stay in place and it would just stick parts of the video on top where it happened to overlap. I still remember being excited when the open source drivers finally gained support for r300 and could do proper textured video...
I had a Matrox Millennium card with a breakout box for capturing RCA, S-Video, and cable TV; I'd watch TV on my Windows 98 SE computer, which was the craziest thing back then, but I always felt like the green-screen-like effect was some kind of mysterious bug that I'd better not mess with, or video capture would break. Windows 98 was barely working on a good day, so it felt like the computer was failing in a graceful and useful way, and I'd better not push my luck.
Every so often you could get a glimpse of the man behind the curtain: dragging the window quickly, or the drivers stuttering, would momentarily reveal the green color (or whatever color it was) before the video card resumed doing its thing. Switching between full screen and windowed mode probably also revealed the magic, as did starting a game that attempted to grab the video hardware context. And of course sometimes other graphical content would happen to use the exact right shade of color, and those pixels would display video too.
I've used that trick as far back as Windows XP, playing videos inside 3D models in programs like SketchUp.
I made extensive use of this, when I found it by accident, in my Winamp skins and GUI programs!
Good to see the first comment there corrects him, and that it's not actually green pixels; at least for the Intel and nVidia drivers I've used before, it appears to be more of a dark magenta. It could be configurable or hardcoded somewhere in the driver, but I don't think it's fixed in hardware.
The desktop compositor takes the graphics content of all the windows, including their composition visuals, and combines them to form a full desktop image that is sent to the monitor.
...at the cost of latency and efficiency.
This unlocked some memories. I remember on my system the chroma colour not being green but some very dark shade of grey that was almost black but not really black… something like #010101
If you watch Twitch, you can see that all instances of the same emote in chat animate together. Then I tested this more generally in a web page, and the same thing happens - if the same gif is placed multiple times in a page, all instances of that gif will play in sync even if loaded at different times. I guess there's a similar idea in browsers then, where maybe there's only one memory representation of the gif across the page or the browser.
I remember using overlay mode in Winamp AVS made it run much faster. Wonder why that was.
I had forgotten about this technique when I was at the excellent https://tnmoc.org recently, looking at their SGI IRIX exhibit featuring a webcam.
The latency of the camera feed on the CRT screen was unbelievably low, even (especially?) by modern standards!
After a minute of pure wonder I remembered about overlays. Still mighty impressive.
Iirc you can also set that "green screen" as wallpaper and have video as desktop background!
Or you could set the WinAmp AVS visualizer to render to overlay, and have it as your desktop background.
Yeah, I used to have a few "live wallpaper" type videos I'd use this way. Around the time AVC-ish algorithms were democratized by DivX. IIRC the player I used had #0000A0 as its overlay color... may have even been the DivX branded player.
...This is the oldest I've ever felt, unsure of my own memories regarding something I have to consult historical records about...
The irony is that in 2025, this answer is now wrong again. Starting with smartphones, scanout hardware supports multiple planes/overlays again, composited on the fly by fixed-function blocks. This avoids having to power on the GPU and waste memory bandwidth (a large share of power use in a smartphone). It no longer involves hacks with green pixels, though.
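As a rough illustration, here's a minimal software model (a hypothetical Python sketch, not any real driver API) of what such a fixed-function block does per pixel: pick the topmost enabled plane covering that pixel, so the planes are merged on the way to the display without the GPU ever compositing a full frame.

    # Sketch of per-pixel plane selection as a fixed-function scanout block might
    # do it: a desktop plane at the bottom, an opaque video plane on top, and no
    # GPU composition pass involved. All names and sizes here are made up.
    from dataclasses import dataclass

    @dataclass
    class Plane:
        x: int          # position of the plane on screen
        y: int
        width: int
        height: int
        pixels: list    # row-major pixel data for this plane

        def covers(self, px, py):
            return self.x <= px < self.x + self.width and self.y <= py < self.y + self.height

        def sample(self, px, py):
            return self.pixels[(py - self.y) * self.width + (px - self.x)]

    def scanout_pixel(planes, px, py):
        # Planes are ordered bottom-to-top; the topmost plane covering the pixel wins.
        for plane in reversed(planes):
            if plane.covers(px, py):
                return plane.sample(px, py)
        return 0  # background

    desktop = Plane(0, 0, 640, 480, [0x202020] * (640 * 480))
    video = Plane(100, 100, 320, 240, [0x336699] * (320 * 240))
    print(hex(scanout_pixel([desktop, video], 150, 150)))  # 0x336699: inside the video plane
    print(hex(scanout_pixel([desktop, video], 10, 10)))    # 0x202020: desktop shows through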
Right, because we have alpha channels now
Not necessary for blending in video overlays, and wasteful. Well, necessary inside the overlay if that is where the controls should appear. Alpha blending is two reads and one write per pixel over the whole affected region (whatever that is; it could be the whole screen). An opaque overlay is one read and one write, and only for the pixels in the desired rectangle.
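To put rough numbers on that, a tiny sketch (made-up resolutions, counting memory operations only, not a benchmark):

    # Rough memory-traffic comparison from the comment above: alpha blending is
    # two reads plus one write per pixel over the whole blended region, while an
    # opaque overlay is one read plus one write, only inside the video rectangle.

    def alpha_blend_traffic(width, height):
        return 3 * width * height   # read source, read destination, write result

    def opaque_overlay_traffic(width, height):
        return 2 * width * height   # read source, write destination

    full_screen = alpha_blend_traffic(1920, 1080)    # blending over the whole desktop
    video_rect = opaque_overlay_traffic(1280, 720)   # copying only the video rectangle
    print(full_screen, video_rect, round(full_screen / video_rect, 1))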
"Nowadays, video rendering is no longer done with overlays."
Darn, I thought this explained why, after upgrading my GPU, videos playing in Chrome have a thin green stripe on their right edge.
Video rendering can still be done with overlays, but it's a little more substantial, involving separate planes with the locations configurable on the graphics card. Look up MPO, Multi-Plane Overlay.
Your green stripe is likely because of the classic combination of unclamped bilinear filtering and a texture that's larger than the output region being used as the drawing surface for the video.
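For the curious, a tiny 1-D sketch (hypothetical luma values) of how that bleed happens: a bilinear sample taken right at the boundary of the visible region averages the last visible texel with the first padding texel.

    # 1-D sketch of edge bleed: without cropping/clamping the sample region to
    # the visible part of the surface, bilinear filtering at the boundary mixes
    # the padding colour into the displayed edge.

    def bilinear_1d(texels, u):
        i = int(u)
        frac = u - i
        j = min(i + 1, len(texels) - 1)   # stay inside the full surface
        return texels[i] * (1 - frac) + texels[j] * frac

    # 1080 visible luma samples followed by 8 rows of dark padding
    row = [200] * 1080 + [0] * 8
    print(bilinear_1d(row, 1078.0))   # 200.0: safely inside the visible region
    print(bilinear_1d(row, 1079.5))   # 100.0: half video, half padding colour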
This is simply wrong. Videos are still rendered with overlays in Chrome on Windows right now. There are many reasons why overlays are still used.
I did some Googling on your behalf as I remember having something like that but can't reproduce it right now:
https://old.reddit.com/r/OLED_Gaming/comments/1kovgdx/green_...
I'd make sure your drivers are up to date before fiddling with Chrome flags though.
They still use "overlays" - it's just that modern implementations are a lot more featureful than "replace all of one colour with another surface", so they tend not to have the same limitations and quirks.
MS started exposing some of these capabilities via MPO in the Windows 8 era [0], and mobile platforms have pretty much always had comprehensive composition pipes in hardware, because power/bandwidth limitations mean compositing the display can be a significant fraction of the total device's performance budget.
I suspect green (or other block colour) artifacts on video edges are due to specification mismatches between what the hardware video decoder produces and how the app displays it, and the bugs that often fall out of that.
Most video compression works on pretty large blocks, normally from 16x16 up to 64x64 depending on the format, and that may not align with the actual video size (e.g. 1080 is not divisible by either). But implementations often still need that extra surface, as things like motion vectors may still refer to data in the "invisible" portion, and it has to be filled with something. It's then real easy to fall into bugs where small numeric errors in things like blending, or even just mix-ups between the different "sizes" that surface has, cause issues at the edges.
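To make the size mismatch concrete, a small sketch (assumed block sizes of 16 and 64, purely for illustration) of how the coded surface gets rounded up:

    # Codecs work on fixed-size blocks, so the decoded surface is rounded up to a
    # multiple of the block size; 1080 is not a multiple of 16 or 64, so the
    # extra rows are padding that should be cropped away before display.

    def coded_size(width, height, block):
        align = lambda x: (x + block - 1) // block * block
        return align(width), align(height)

    print(coded_size(1920, 1080, 16))   # (1920, 1088): 8 padded rows at the bottom
    print(coded_size(1920, 1080, 64))   # (1920, 1152): 72 padded rows at the bottom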
I suspect the other comment about using ANGLE/dx9/dx11-on-12 may be effective as it /also/ causes the hardware video decoder not to be used, which is often a source of "unexpected" differences or limitations that commonly cause errors like that.
[0] https://learn.microsoft.com/en-us/windows-hardware/drivers/d...
Fun Fact: This same sort of thing also happened on the Classic Macintosh Quadra 840AV, when running in 8-bit (256 color) mode. Playback of realtime video capture reserved color index #243 (a very dark green in the system palette), and ANYWHERE that color was used, it would be replaced with the live video. I created some cool effects using this back in the 90s.
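The trick itself is easy to model; here's a sketch (hypothetical frame data, not the actual Quadra AV hardware path) of replacing every pixel that matches the reserved key value with the corresponding video pixel:

    # Sketch of colour-key overlay: wherever the framebuffer holds the reserved
    # key value, the overlay hardware substitutes the live video pixel instead.

    KEY_INDEX = 243  # the reserved palette index mentioned above

    def composite_scanline(desktop_row, video_row, key=KEY_INDEX):
        return [v if d == key else d for d, v in zip(desktop_row, video_row)]

    desktop = [7, 243, 243, 7, 243]       # a window painted with the key colour
    video   = [10, 11, 12, 13, 14]        # the live video scanline
    print(composite_scanline(desktop, video))  # [7, 11, 12, 7, 14]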
Yes, other Mac AV models did the same thing. I remember doing this on the 6100/60AV that we had.
Are you the one who created the techno-anthem "pump up the jam"? Sweet!
This was a nice trick to protect text from copying, for instance student assignments. Students could still use a digital camera on a CRT display, but 20 years ago cameras were costly and students did not have them. And typing text from scratch was a tedious job. So assignments served online were not shared too quickly.
I had my first digital camera in 2003 and my first videocall-capable phone in 2004 (NEC e313). I was a kid.
> And typing text from scratch was a tedious job
At least by typing them out, the typist might learn something. :)
Around 2005 digital cameras were commonplace. Mobile phones also had cameras by then, even if not very good ones by today's standards. Maybe you're talking about an earlier time?
2005 is pre iPhone.
While cameras were definitely common, they also weren't quite as ubiquitous as they are today.
Lots of families only had one for trips etc., so it wasn't readily available for kids to take photos of screens.
Many flip phones had cameras by then. For instance, the Razr V3 was the best selling phone of 2005, and had a 640×480px camera.
> 2005 is pre iPhone
I hate how incompetent tech writing and marketing rewrote and simplified mobile phone history into pre/post-iPhone. Yes, we did have touchscreens, smartphones, and camera-enabled devices many years before the iPhone. Arguably, on several metrics, many Symbian/Linux/BlackBerry phones of that era were better smartphones than today's iPhones and Androids, at least as measured by hardware capabilities that got removed over time while arbitrary constraints got added on the software front.
I don't know about other markets, but the first iPhone didn't sell well, especially because it was behind the high-end feature phones of the time at a higher price.
The iPhone that made it was the 3G.