This story, too, begins with noise. I was browsing the radio waves with a software radio, looking for mysteries to accompany my ginger tea. I had started to notice a wide-band spiky signal on a number of frequencies that only seemed to appear indoors. Some sort of interference from electronic devices, probably. Spoiler alert, it eventually led me to broadcast a webcam picture over the radio waves... but how?
It sounds like video
The mystery deepened when I listened to what this interference sounded like as an AM signal. It reminded me of the time I mistakenly plugged our home stereo system into the Nintendo console's video output and heard a very similar buzz.
Am I possibly listening to video? Why would there be analog video transmitting on any frequency, let alone inside my home?
If we plot the signal's amplitude against time we can see that there is a strong pulse exactly 60 times per second. This could be the vertical synchronisation signal of 60 Hz video. A shorter pulse (pictured above) can be seen repeating more frequently; it could be the horizontal one. Between these pulses there is what appears to be noise. Maybe, if we use the strong pulses for synchronisation and plot the amplitude of that noise as a two-dimensional picture, we could see something?
And sure enough, when main screen turn on, we get signal:
(I've hidden the bright synchronisation signal from this picture.)
It seems to be my Raspberry Pi's desktop with weirdly distorted greyscale colours! Somehow, some part of the monitor setup is radiating it quite loudly into the aether. The frequency I'm listening to is a multiple of the monitor's pixel clock frequency.
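The folding described above can be sketched in a few lines. This is a minimal sketch with made-up numbers: the sample rate, the line count, and the sync position are all assumptions for illustration, not my actual decoder.

```python
import numpy as np

# Hypothetical numbers; none of these come from the actual setup.
SAMPLE_RATE = 10_000_000    # SDR sample rate, samples per second
FRAME_RATE = 60             # rate of the strong (vertical sync) pulses
LINES = 806                 # assumed scan lines per frame, incl. blanking

def fold_to_image(amplitude, frame_start):
    """Fold a 1-D amplitude stream into a 2-D picture, one row per
    scan line, starting from a detected vertical sync position."""
    per_frame = SAMPLE_RATE // FRAME_RATE
    per_line = per_frame // LINES
    frame = amplitude[frame_start:frame_start + per_line * LINES]
    return frame.reshape(LINES, per_line)

# Demo: a synthetic stream whose "horizontal sync" pulse shows up as a
# bright vertical bar once folded.
t = np.arange(SAMPLE_RATE // FRAME_RATE)
demo = (t % ((SAMPLE_RATE // FRAME_RATE) // LINES) < 3).astype(float)
img = fold_to_image(demo, 0)
```

Anything that repeats in step with the line rate stacks up into vertical structure in `img`; everything else looks like noise.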
As it turns out, this vulnerability of some monitors has been known for a long time. In 1985, van Eck demonstrated how CRT monitors can be spied on from a distance[1]; and in 2004, Markus Kuhn showed that the same still works on flat-screen monitors[2]. The image is heavily distorted, but some shapes and even bigger text can be recognisable. Sometimes this kind of eavesdropping is called "van Eck phreaking", in reference to phone phreaking.
The next thought was, could we get any more information out of these images? Is there any information about colour?
Mapping all the colours
HDMI is fully digital; there is no linear dependency between pixel values and greyscale brightness in this amplitude image. I believe the brightness in the above image is related to the number of bit transitions over my radio's sampling time (which is around 8 bit-lengths); and in HDMI, this is dependent on many things, not just the actual RGB value of the pixel. HDMI also uses multiple differential wires that are all transmitting their own picture channels side by side.
This is why I don't think it's easy to reconstruct a clear picture of what's being shown on the screen, let alone decode any colours.
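To see why the radiated amplitude tracks bit patterns rather than brightness, here is a simplified sketch of the first, transition-minimising stage of TMDS encoding. This is an illustration only: it omits the DC-balancing second stage and the control periods entirely.

```python
def tmds_stage1(byte):
    """First (transition-minimising) stage of TMDS 8b/10b encoding:
    data bits are chained with XOR or XNOR, whichever yields fewer
    transitions; a 9th bit records the choice. The DC-balancing
    second stage is omitted in this sketch."""
    d = [(byte >> i) & 1 for i in range(8)]
    use_xor = sum(d) < 4 or (sum(d) == 4 and d[0] == 1)
    q = [d[0]]
    for i in range(1, 8):
        bit = q[-1] ^ d[i]
        q.append(bit if use_xor else 1 - bit)
    q.append(1 if use_xor else 0)
    return q

def transitions(bits):
    """Number of 0<->1 transitions: roughly what the receiver's
    amplitude reflects over its ~8-bit sampling window."""
    return sum(a != b for a, b in zip(bits, bits[1:]))
```

Different pixel values produce different transition counts in the encoded word, which is one reason two equally bright colours can radiate with very different strengths.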
But could the reverse be possible? Could we control this phenomenon to draw the greyscale pictures of our choice on the receiver's screen? How about sending binary data by displaying alternating pixel values on the monitor?
My monitor uses 16-bit colours. There are "only" 65,536 different colours, so it's possible to go through all of them and see how each appears in the receiver. But it's not that simple; the bit-pattern of a HDMI pixel can actually get modified based on what came before it. And my radio isn't fast enough to even tell the bits apart anyway. What we could do is fill entire lines with one colour and average the received signal strength. We would then get a mapping for single-colour horizontal streaks (above). Assuming a long run of the same colour always produces the same bitstream, this could be good enough.
Here's the map of all the colours and their intensity in the radio receiver. (Whatever happens between 16,128 and 16,384? I don't know.)
Now, we can resample a greyscale image so that its pixels become short horizontal lines. Then, for every greyscale value find the closest matching RGB565 colour in the above map. When we display this psychedelic hodge-podge of colour on the screen (on the right), enough of the above mapping seems to be preserved to produce a recognisable picture of a movie[3] on the receiver side (on the left):
These colours are not constant in any way. If I move the antenna around, even if I turn it from vertical to horizontal, the greyscales will shift or even get inverted. If I tune the radio to another harmonic of the pixel clock frequency, the image seems to break down completely. (Are there more secrets to be unfolded in these variations?)
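The greyscale-to-colour lookup above can be sketched as follows. Here the `intensity` array is a random stand-in for the real measured sweep of all 65,536 RGB565 colours; only the lookup logic is the point.

```python
import numpy as np

# Stand-in for the measured map: colour index -> received amplitude.
rng = np.random.default_rng(0)
intensity = rng.random(65536)

order = np.argsort(intensity)       # colours sorted by received amplitude
levels = intensity[order]

def grey_to_rgb565(grey):
    """Pick the RGB565 colour whose received amplitude is closest to
    the desired greyscale level (0 = darkest, 255 = brightest)."""
    target = levels[0] + (grey / 255.0) * (levels[-1] - levels[0])
    i = int(np.searchsorted(levels, target))
    i = min(i, len(levels) - 1)
    # The neighbour on the left may be closer to the target.
    if i > 0 and target - levels[i - 1] < levels[i] - target:
        i -= 1
    return int(order[i])
```

Sorting once and binary-searching per pixel keeps the conversion fast even for full frames.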
The binary exfiltration protocol
Now we should have enough information to be able to transmit bits. Maybe even big files and streaming data, depending on the bitrate we can achieve.
First of all, how should one bit be encoded? The absolute brightness will fluctuate depending on radio conditions. So I decided to encode bits as the brightness difference between two short horizontal lines. Positive difference means 1 and negative 0. This should stay fairly constant, unless the colours completely flip around that is.
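A sketch of that encoding, not my actual code: `HI` and `LO` stand for two colours measured to radiate strongly and weakly.

```python
# Hypothetical received amplitudes of the two chosen colours.
HI, LO = 1.0, 0.2

def encode(bits):
    """One bit -> two adjacent short lines: (HI, LO) for 1, (LO, HI) for 0."""
    amps = []
    for b in bits:
        amps += [HI, LO] if b else [LO, HI]
    return amps

def decode(amps):
    """Positive difference between the pair's amplitudes means 1."""
    return [1 if amps[i] > amps[i + 1] else 0
            for i in range(0, len(amps), 2)]
```

Because only the sign of the difference matters, the scheme survives changes in overall signal level, though not a wholesale inversion of the colour map.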
The monitor has 768 pixels vertically. This is a nice number so I designed a packet that runs vertically across the display. (This proved to be a bad decision, as we will later see.) We can stack as many packets side-by-side as the monitor width allows. A new batch of packets can be displayed in each frame, or we can repeat them over multiple frames to improve reliability.
These packets should have some metadata, at least a sequence number. Our medium is also quite noisy, so we need some kind of forward error correction. I'm using a Hamming(12,8) code which adds 4 error correction bits for every 8 bits of data. Finally, we need to add a CRC to each packet so we can make sure it arrived intact; I chose CRC16 with the polynomial 0x8005 (just because liquid-dsp provided it by default).
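The error protection can be sketched like this: Hamming(12,8) corrects any single-bit error per codeword, and a CRC-16 with polynomial 0x8005 checks whole-packet integrity. The bit ordering and the exact CRC variant (reflection, initial value) here are my assumptions, not necessarily what liquid-dsp does internally.

```python
DATA_POS = [3, 5, 6, 7, 9, 10, 11, 12]   # parity bits sit at positions 1, 2, 4, 8

def hamming128_encode(byte):
    cw = [0] * 13                         # 1-indexed; cw[0] unused
    for i, pos in enumerate(DATA_POS):
        cw[pos] = (byte >> i) & 1
    for p in (1, 2, 4, 8):                # parity p covers positions with bit p set
        cw[p] = sum(cw[k] for k in range(1, 13) if k & p) & 1
    return cw[1:]

def hamming128_decode(codeword):
    cw = [0] + list(codeword)
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(cw[k] for k in range(1, 13) if k & p) & 1:
            syndrome |= p
    if syndrome:                          # the syndrome names the flipped position
        cw[syndrome] ^= 1
    return sum(cw[pos] << i for i, pos in enumerate(DATA_POS))

def crc16_8005(data, crc=0):
    """Bit-reflected CRC-16 with polynomial 0x8005 (0xA001 reversed)."""
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc
```

The decoder's syndrome directly names the position of a single flipped bit, which is what makes Hamming codes so cheap to correct with.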
First results!
It was quite unbelievable: I was able to transmit a looping 64 kbps audio stream almost without any glitches, with the monitor and the receiver in the same room approximately 2 meters from each other.
Quick tip. Raw 8-bit PCM audio is a nice test format for these kinds of streaming experiments. It's straightforward to set an arbitrary bitrate by resampling the sound (with SoX for instance); there's no structure, headers, or byte order to deal with; and any packet loss, misorder, or buffer underrun is instantly audible. You can use a headerless companding algorithm like A-law to fit more dynamic range in 8 bits. Even stereo works; if you start from the wrong byte the channels will just get swapped. SoX can also play back the stream.
But can we get more? Slowly I added more samples per second, and a second audio channel. Suddenly we were at 256 kbps and still running smoothly. 200 kbps was even possible from the adjacent room, with a directional antenna 5 meters away, and with the door closed! In the same room, it worked up to around 512 kilobits per second but then hit a wall.
A tearful performance
The heavy error correction and framing adds around 60% of overhead, and we're left with 480 bits of 'payload' per packet. If we have 39 packets per frame at 60 frames per second we should get more than a megabit per second, right? But for some reason it always caps at half a megabit.
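Spelling out that expectation from the numbers above:

```python
# Back-of-the-envelope rate from the packet design.
payload_bits = 480        # per packet, after ~60 % overhead
packets_per_frame = 39
fps = 60
bitrate = payload_bits * packets_per_frame * fps
print(bitrate)            # 1123200 -> a bit over a megabit per second
```

So the theoretical ceiling is roughly 1.1 Mbps, yet the link stalled at about half that.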
The reason revealed itself when I noticed every other frame was often completely discarded at the CRC check. Of course; I should have thought of properly synchronising the screen update to the graphics adapter's frame update cycle (or VSYNC). This would prevent the picture information changing mid-frame, also known as tearing. But whatever options I tried with the SDL library I couldn't get the Raspberry Pi 4 to not introduce tearing.
Screen tearing appears to be an unsolved problem plaguing the Raspberry Pi 4 specifically (see this Google search). I tried another mini computer, the Asus Tinker Board R2.0, but I couldn't get the graphics drivers to work properly. I then realised it was a mistake to have the packets run from top to bottom; any horizontal tearing will cut every single packet in half! With a horizontal design only one packet per frame would suffer this fate.
A new design enables video-over-video
Packets that run horizontally across the screen indeed fix most of the packet loss. It may also help with CPU load as it improves memory access locality. I'm now able to get 1000 kbps from the monitor! What could this be used for? A live video stream, perhaps?
But the clock was ticking. I had a presentation coming up and I really wanted to amaze everyone with a video transfer demo. I quite literally got it working on the morning of the event. For simplicity, I decided to go with MJPEG, even though fancier schemes could compress way more efficiently. The packet loss issues are mostly kept at bay by repeating frames.
The data stream is "hidden" in a Windows desktop screenshot; I'm changing the colours in a way that both creates a readable bit and also looks inconspicuous when you look from far away.
Mitigations
This was a fun project but this kind of vulnerability could, in the tinfoiliest of situations, be used for exfiltrating information out of a supposedly airgapped computer.
The issue has been alleviated in some modern display protocols. DisplayPort[4] makes use of scrambling: a pseudorandom sequence of bits is mixed with the bitstream to remove the strong clock oscillations that are so easily radiated out. This also randomizes the bitstream-to-amplitude correlation. I haven't personally tested whether it still has some kind of video in its radio interference, though. (Edit: Scrambling seems to be optionally supported by later versions of HDMI, too – but it might depend on which features exactly the two devices negotiate. How could you know if it's turned on?)
I've also tried wrapping the monitor in tinfoil (very impractical) and inside a cage made out of chicken wire (it had no effect - perhaps I should have grounded it?). I can't recommend either of these.
Software considerations
This project was made possible by at least C++, Perl, SoX, ImageMagick, liquid-dsp, Dear Imgui, GLFW, turbojpeg, and v4l2! If you're a library that feels left out, please leave a comment.
If you wish to play around with video emanations, I heard there is a project called TempestSDR. For generic analog video decoding via a software radio, there is TVSharp.
References
- Van Eck, Wim (1985): Electromagnetic radiation from video display units: An eavesdropping risk?
- Kuhn, Markus (2004): Electromagnetic Eavesdropping Risks of Flat-Panel Displays
- KUNG FURY Official Movie [HD] (2015)
- Video Electronics Standards Association (2006): DisplayPort Standard, version 1.
Really interesting stuff. I was making band-pass delta-sigma modulation DACs in an FPGA a while back, and was able to transmit an FM modulated signal with enough bandwidth to match commercial stations, in the FM band, using just a single digital output pin toggling at 350 MHz. It makes me wonder if you could band-pass delta-sigma modulate the pixels to get an RF signal that transmits well and do PSK or FSK through the HDMI port. It might also be possible to select pixel values cleverly to force the TMDS encoding of the HDMI 1.4 standard to output more/fewer 1s or 0s to get better amplitude control.
What software radio did you use?
Hi, I used the Airspy R2.
Really fun to read, thanks for posting! As for the chicken wire, grounding helps, but the bigger problem is the size of the holes. They should be 10x smaller than your wavelength - then it won't get through.
The holes were quite small, however I think it might be the cable that is radiating the signal. Or mostly the cable, at least. And it was left outside the cage.
I never got around to understanding the recent (ish) papers on the mathematical models of Faraday cages, but especially https://people.maths.ox.ac.uk/trefethen/faraday_published.pdf gives a nice insight into the relationship between not only the size of the holes, but the thickness of the wire. Perhaps it has little impact here, although industry maybe already uses this to minimise material waste when protecting consumers with their microwaves.
nice
What frequency does the HDMI radiate on? I have a rpi4 with monitor driving my Ham Radio gear so I'm wondering about interference in the Ham bands in particular!
It depends on your pixel clock frequency. For me, the strongest signal is around 424 MHz.
i imagine the cabling is the strongest emitter, not the screen itself, so perhaps tinfoiling that part would be more practical in execution and usability
Very nice job!
Is Gempest v13.38 (tool and/or source code) available?
No, but I've heard of a similar tool called TempestSDR that is open source.
Where can I download gempest?
Perhaps this is the composite video that the Pi can output, somehow leaking out? Does the frequency that this is on change if you enable tvout?
Probably not, I've seen a similar signal from other computers as well
Windy,
You mention, "scrambling: a pseudorandom sequence of bits is mixed with the bitstream to remove the strong clock oscillations that are so easily radiated out." also known as "whitening" or "Direct Sequence Spread Spectrum".
This trick is used to "get inside the mask" of EMC specs on, amongst other things, motherboards. I can tell you from experience that if you find out what the spreading code is and synchronize to it, some real magic happens. Look up "Code Division Multiplexing" to see how you can pull out just one signal from many.
So what it means is that the old trick of hiding a computer in the noise of many computers really does not work against a knowledgeable attacker. They actually get not a much decreased range, as people mistakenly assume, but an increased range.
Also consider the use of multiple synchronised receivers using low gain directional antennas. The antennas give you noise-free amplification, but if you set their position up correctly and phase the receivers correctly you get the equivalent of a "Very Long Baseline" (VLB) highly directional and higher gain antenna. It's a trick Radio Astronomers use, and there are a lot of "home observers" involved, so there is software available to set up your own set of receivers.
I've mentioned all of this before over on the Schneier on Security blog a couple of times as I did with Markus on the LightBlueTouchPaper blog when he published his work.
Have fun, there are actually very few academic researchers working in the TEMPEST EmSec area, especially with the likes of "EM Active Fault Injection" attacks where you illuminate cables and ventilation slots with 10 GHz and similar signals (remember slots are antennas just like dipoles). If you AM or PM modulate the signal, it goes through the slot, where internal wiring carries it forward to the IC I/O pads; there the protection diodes act like a simple AM product detector, thus dumping the modulation signal directly on the I/O pin centered around the logic threshold point. If you look at the LightBlue blog you will find a couple of students there turned a very expensive IBM secure True Random Generator from 32 bits down to less than 7 bits, thus making guessing attacks trivial. Back last century I had a brief chat with Prof Ross J. Anderson about EM fault injection, to show why trying to solve "Power Analysis" on Smart Cards using random clocked logic would not work. Back in the 1980s, when I started my own independent investigations and found out how to do Fault Injection, it brought me to the attention of some "unhappy people"; apparently in the US such knowledge was highly classified and their job was to persuade me not to talk about it. Well, I did talk, but back then nobody was really interested. Oh, and Van Eck was not the first to demonstrate this. The UK BBC back in the 1970s had a programme called "Tomorrow's World" which demonstrated the reconstruction of a VDU screen, in 1976 if I remember correctly. It created quite a stink at the time, then vanished as though it had never been aired (such was the power of DORA or "D-Notices").
The spooks really loved their "Do not Play on our Grass" notices back then.
that's fascinating - I would love resources to learn more about this area of IT from a security research perspective if you have any.
aybabtu
> But whatever options I tried with the SDL library I couldn't get the Raspberry Pi 4 to not introduce tearing.
I was able to find a trick; use timing tricks to beam race the tearline between refresh cycles.
Google "Tearline Jedi". You can use precise timing tricks to present the frame at a very exact time interval, to create raster-exact tearline positions.
If you can access the VSYNC signal (via any API) and/or can dejitter it (e.g. RefreshRateCalculator.js algorithm), you can most certainly beamrace tearlines on most platforms. The time offset between VSYNC's is the approximate raster scan line number the tearing will appear at.
If you don't have access to finding the VSYNC heartbeat, you could attempt a feedback loop (dongle wire tapping the video output to a GPIO -- use a HDMI splitter + HDMI-to-VGA adaptor + wire to the VSYNC pin of VGA + wire to GPIO of Pi) to listen to your Raspberry Pi's VSYNC output. And present your frames at exact intervals.
The other option is to try to use a precise 59.94Hz clock or 60Hz clock, derived from a ModeLine (pixel clock) -- try to pull your ModeLine from your Linux config to determine exact refresh rate. You may need to use a phase adjuster to guess the position of the tearline, or simply use a stationary-tearline-resistant version of your protocol. You may have to busyloop on RDTSC before presenting your frame, to keep tearline jitter as zeroed-out on your Pi.
Hi, do we know by now if DisplayPort cables or optical HDMI cables help reduce the leakage? There is a paper online where they reconstruct the screen of an HDMI-connected monitor at 80 meters distance. That's a lot.
Hi, yes, DisplayPort is not vulnerable to this attack. The protocol uses scrambling to make the picture unreadable (in practical terms). However it might of course have other undiscovered vulnerabilities. Attacks against the scrambler could be a possibility but would take huge effort.
I saw https://www.researchgate.net/publication/352644553_Compromising_Electromagnetic_Radiation_of_Information_Displays. In figure 8.7 of chapter 8, they give a screen of recovered signal from DP; while it looks worse because of merged partial images resulting from DP parallelism, it still seems some parts of the image are visible (despite scrambling?). The description says it's from 2 meter distance. I also wonder if the scrambling signals could "average out" when recording a still image for a while?
DeleteCould this help: https://www.amazon.com/-/de/dp/B09KZK1QV2/ref=sr_1_6 or https://www.amazon.com/-/de/dp/B098QDRGWJ/ref=sr_1_15 ? The optical displayport cables refer to also make use of copper, though. Can‘t find what data signal uses which fiber.