The Speed of Light, Caught on Film
December 13, 2011 7:31 AM Subscribe
Capturing light in motion with trillionth-of-a-second resolution. MIT, using a new technique called Femto Photography (femtosecond laser illumination, picosecond-accurate detectors, and mathematical reconstruction techniques), has captured the movement of pulses of light.
This music sounds like the mission loading screen of squad based tactical first person shooter.
posted by nathancaswell at 7:35 AM on December 13, 2011 [4 favorites]
Holy mackerel, scroll down to the video of the individual laser pulse moving through the plastic bottle.
posted by jquinby at 7:37 AM on December 13, 2011 [2 favorites]
Physics has always amazed me with how they roll their own measurement devices and deal with noise, interference, etc. After struggling with input impedance on an oscilloscope, I'm still dubious about the precision, but impressed nonetheless.
posted by k5.user at 7:41 AM on December 13, 2011
reflectected, milimeter, billiongth... faster than the speed of spellcheck!
posted by oulipian at 7:44 AM on December 13, 2011 [1 favorite]
Through a system of mirrors, we...
I just knew it.
posted by orme at 7:48 AM on December 13, 2011 [4 favorites]
One of the things that's messing with my head here is how far the camera is from the subject. I mean, we seem to be at least a bottle's length away from the bottle, right? So in this reconstruction, if the light is moving at roughly the same speed through the bottle as through the air, then when we see it hit the base of the bottle, it's actually already hit the cap ... and by the time we see it hit the cap, it's actually already dark again.
In other words, this is kind of like the "time travel" of looking up at light from a star that burned out a million years ago ... except we're only six inches away from an object that is no longer actually emitting light for the entire time we see it. That is nuts.
(I don't speak physics, so I'm probably using the words "moving", "hitting", "time", "nuts" etc. all wrong. It's still crazy nuts by my definition.)
posted by Honorable John at 7:50 AM on December 13, 2011 [3 favorites]
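A rough check of Honorable John's intuition, as a sketch: assuming a ~30 cm water-filled bottle and a camera ~15 cm away (both distances guessed from the video), the pulse's transit and the light's trip to the camera really are on comparable timescales:

```python
# Back-of-the-envelope travel times for the bottle scene; the bottle length
# and camera distance are assumptions, not measured values.
c = 3.0e8            # speed of light in vacuum, m/s
n_water = 1.33       # refractive index of water

bottle_len = 0.30    # m, assumed bottle length
cam_dist = 0.15      # m, assumed camera-to-bottle distance

t_traverse = bottle_len * n_water / c   # pulse transit time through the water
t_to_camera = cam_dist / c              # delay before scattered light reaches us

print(f"pulse crosses bottle in {t_traverse*1e12:.0f} ps")       # ~1330 ps
print(f"light needs {t_to_camera*1e12:.0f} ps to reach camera")  # ~500 ps
# So by the time we "see" the pulse at the base, it is already ~0.5 ns further along.
```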
Well, you know the old joke: the person with more than 10 items in the express lane at a Cambridge grocery is either from MIT and can't read, or from Harvard and can't count.
The bottle is definitely the most easily understood visualization of what one might describe as an object moving at light speed, "captured" on video.
posted by dglynn at 7:55 AM on December 13, 2011
We already do aperture synthesis with radio telescopes. But for optical, it's a big hassle:
Aperture synthesis is possible only if both the amplitude and the phase of the incoming signal is measured by each telescope. For radio frequencies, this is possible by electronics, while for optical light, the electromagnetic field cannot be measured directly and correlated in software, but must be propagated by sensitive optics and interfered optically. Accurate optical delay and atmospheric wavefront aberration correction is required, a very demanding technology which became possible only in the 1990s. This is why imaging with aperture synthesis has been used successfully in radio astronomy since the 1950s and in optical/infrared astronomy only since the 2000 decade. See astronomical interferometer for more information.
Those "sensitive optics" mean your multiple sensors have to be pretty close together because they are physically linked. But now that we can accurately time photon arrival, you could save the input from each optical telescope separately. The resolution of a telescope is a function of the (apparent) diameter of the "lens", so if you put two satellites in orbit offset by 180°, you'd have a telescope bigger than the Earth, which, if I've done my math right, means an angular resolution of GIGGIDY CROW.
TLDR: Get ready for pictures of beings living on earth-like planets in other solar systems.
posted by DU at 7:57 AM on December 13, 2011 [3 favorites]
Can you capture any event at this frame rate? What are the limitations?
We can NOT capture arbitrary events at picosecond time resolution. If the event is not repeatable, the required signal to noise ratio will make it nearly impossible to capture the event.
Pfff, nevermind.
posted by DU at 8:01 AM on December 13, 2011 [2 favorites]
It is a wave! Not a particle! I knew it!
posted by dances_with_sneetches at 8:02 AM on December 13, 2011 [1 favorite]
DU: Whoa there. First of all, this technique doesn't seem to capture any phase information (I don't have access to the paper, I'm just going by their summary). Second of all, even if you could use interferometry to make a visible-light telescope with an effective 13,000 km aperture, it still wouldn't have enough resolution to make out an object smaller than about a mile at the distance of Proxima Centauri, even under ideal conditions.
That said, this is an awesomely cool piece of apparatus.
posted by teraflop at 8:07 AM on December 13, 2011
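teraflop's "about a mile" figure is straightforward to verify from the Rayleigh criterion; a minimal sketch, assuming 500 nm light and DU's Earth-sized baseline:

```python
# Diffraction limit check: theta ~ 1.22 * lambda / D (Rayleigh criterion).
wavelength = 500e-9             # m, assumed visible light
aperture = 1.3e7                # m, ~Earth-diameter baseline (DU's two satellites)
dist_proxima = 4.24 * 9.461e15  # m, Proxima Centauri at 4.24 light-years

theta = 1.22 * wavelength / aperture   # diffraction-limited angular resolution, rad
smallest = theta * dist_proxima        # smallest resolvable feature at that distance

print(f"angular resolution: {theta:.1e} rad")            # ~4.7e-14 rad
print(f"smallest resolvable feature: {smallest:.0f} m")  # ~1900 m, about a mile
```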
If you notice how the tomato and the bottle cap keep glowing for a while, it's because these are translucent objects and the photons are just kind of "fucking around" in there, bouncing around for a while before they can get out and back to the detector.
Skin would be the same way, so honestly I don't know how they're going to do any kind of medical imaging with this. But I would never have dreamed up MRI, either.
posted by seanmpuckett at 8:07 AM on December 13, 2011 [4 favorites]
This music sounds like the mission loading screen of squad based tactical first person shooter.
By some coincidence, I chaired the SIGGRAPH technical session where this research was presented. The researcher played the bottle video over and over, each time from beginning to end, and each time with that music absolutely blasting. A member of the audience finally approached one of the mics to ask him to turn it down.
I feel slightly embarrassed. That should have been my job.
posted by rlk at 8:13 AM on December 13, 2011 [4 favorites]
Ooooh, pretty. Now do particle.
(I was actually considering how that would look, but chances are if you released a single photon, it wouldn't hit the camera, and even if it did, you'd just see a single bright pixel. I wonder if you could release single photons at a slow enough rate that you'd see a pixelated wavefront...)
posted by Popular Ethics at 8:18 AM on December 13, 2011 [1 favorite]
I'm trying to get my head around the bottle video. What are we actually seeing in that video?
Naively, it seems to me like photons are being emitted from the main body of photons that we see traveling, and hitting the camera. They're being emitted (more or less) perpendicularly to the direction of travel of the main body.
Is that correct? If so, why does it happen? Because some of the photons in the main body hit air particles, and are deflected? So if this were done in a vacuum, we wouldn't see anything except when the main body of photons hits the bottle and when it exits the bottle?
How many photons are in a main body of photons like that? And how many of them are we actually seeing (i.e. how many leave the main body and hit the camera)?
The other possibility that occurred to me is a hand-wavey "No, you're seeing the original emission as a wave hitting the camera", which doesn't seem realistic. That body of photons sure seems localized to a certain spot, as if light were being emitted from it specifically as it travels.
posted by Flunkie at 8:32 AM on December 13, 2011 [1 favorite]
Flunkie, I've been trying to put into words exactly the same thoughts for about an hour now. Physics is hard and it makes my brains hurt.
They make reference to this as a visualization, so I wonder if the equipment detected the wave and they used some other tools to show what that wave would look like... or something... I need to lie down.
posted by VTX at 8:54 AM on December 13, 2011
Flunkie Is that correct? If so, why does it happen? Because some of the photons in the main body hit air particles, and are deflected? So if this were done in a vacuum, we wouldn't see anything except when the main body of photons hits the bottle and when it exits the bottle?
That's my understanding too. AFAIK, the experimenters emitted an *extremely* short burst of photons at some repetition frequency "F", and they set the camera with a "frame rate" of F + a *tiny* little bit "dF", so that each frame captured the next group of photons (and the reflections from earlier groups) advanced in time by dF. It appears to be similar to how a sampling oscilloscope works, or that illusion where the spokes of a wheel turn backwards.
somebody correct me if I'm wrong
posted by Popular Ethics at 9:04 AM on December 13, 2011 [1 favorite]
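A toy model of the sampling scheme Popular Ethics describes, in the spirit of a sampling oscilloscope; the 13 ns repetition period matches range's comment below, and the 2 ps step and pulse shape are purely illustrative:

```python
from math import exp

# Equivalent-time sampling toy model: a scene that repeats with period T is
# sampled once every T + dT, so sample k lands k*dT into the pulse cycle,
# tracing the pulse out in slow motion even though samples are 13 ns apart.
T = 13e-9    # s, assumed pulse repetition period
dT = 2e-12   # s, assumed per-sample advance (the effective time resolution)

def scene(t):
    """A repeating Gaussian pulse, ~5 ps wide, centered 20 ps into each cycle."""
    phase = t % T
    return exp(-((phase - 20e-12) / 5e-12) ** 2)

samples = [scene(k * (T + dT)) for k in range(25)]
print(" ".join(f"{s:.2f}" for s in samples))  # rises to 1.00 at k = 10, then falls
```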
OK, as far as I can tell, what they're doing is combining some sort of lightning fast shutter system with a stuttering laser pulse, timed to catch one pulse on the frame at any given time. They shoot thousands if not millions of frames of that, then pull out specific frames and re-arrange them to illustrate what the path of a single pulse would look like. Since all the pulses are identical, this is pretty cromulent. Then, since the actual shooting is B&W by necessity because of the low light, I think they go back in and overlay the color from a still photo by computer interpolation.
It is definitely a visualization or reconstruction rather than just a high-speed shot -- you could never use this technique to visualize water droplets or a bullet going through an apple, because you can't do those billions of times and have every one be identical. But man alive it is COOL AS FUCK. Great post, thanks.
posted by KathrynT at 9:19 AM on December 13, 2011 [2 favorites]
Let me continue the explanation a bit: If you set the sweep period of an oscilloscope a little bit longer than the period of the wave you're measuring, you'll see the wave move from the left of the screen to the right, as if you were watching a slowed-time movie of the wave's movement. Similarly, the spoke illusion appears to show you a time-reversed movie of the wheel's movement. By playing around with the frequency of your camera w.r.t. the frequency of the scene, you can slow, stop or reverse the motion of the objects in the scene. As DU cited, this technique only works for scenes which repeat themselves exactly.
There's another trick happening here that I haven't quite grasped. They state in their abstract that a very short burst of photons would not have enough energy to expose a camera sensor, so it appears that they recorded the sequence many times and added the matching frames together to get the final movie. I'm not sure exactly how they did this in their setup... back to the article!
posted by Popular Ethics at 9:21 AM on December 13, 2011
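One way to read that trick, sketched numerically: because every pulse is identical, the handful of photons from each picosecond-wide slice can be summed over millions of repeats until shot noise averages out. The 26-photon figure anticipates schmod's estimate further down; the repeat count here is a guess:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

mean_photons = 26        # expected photons in one ~1 ps exposure (see schmod below)
n_repeats = 1_000_000    # assumed number of identical pulses summed per frame

single = rng.poisson(mean_photons)                    # one exposure: dominated by shot noise
stacked = rng.poisson(mean_photons, n_repeats).sum()  # summed over repeated pulses

print(f"single exposure: {single} photons, relative noise ~{mean_photons**-0.5:.0%}")
print(f"stacked exposure: {stacked:.2e} photons, relative noise ~{(mean_photons*n_repeats)**-0.5:.2%}")
```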
teraflop: "DU: Whoa there. First of all, this technique doesn't seem to capture any phase information (I don't have access to the paper, I'm just going by their summary). Second of all, even if you could use interferometry to make a visible-light telescope with an effective 13,000 km aperture, it still wouldn't have enough resolution to make out an object smaller than about a mile at the distance of Proxima Centauri, even under ideal conditions.
That said, this is an awesomely cool piece of apparatus."
Still enough to find spacewhales, and that's all I'm asking for.
posted by symbioid at 9:24 AM on December 13, 2011 [1 favorite]
Naively, it seems to me like photons are being emitted from the main body of photons that we see traveling, and hitting the camera. They're being emitted (more or less) perpendicularly to the direction of travel of the main body.
Is that correct? If so, why does it happen? Because some of the photons in the main body hit air particles, and are deflected? So if this were done in a vacuum, we wouldn't see anything except when the main body of photons hits the bottle and when it exits the bottle?
The bottle is full of water. Just like the prior videos, you're not seeing the 'pulse' or 'wave' directly; you're seeing the photons that scatter off of various other particles at an angle visible to the camera. With the apple vids, you see the wave as it collides with the table and the apple, but you don't see a shimmering curtain hanging in the air or anything weird like that. With the bottle, you're seeing the 'pulse' by way of the photons that are scattered by the water.
posted by FatherDagon at 9:37 AM on December 13, 2011 [1 favorite]
At first glance, I think you're right, Popular Ethics. It looks like they're doing a variant of the trick that is used (for example) in the water piddler that sits right outside my office. Order of operations seems to be:
1) Start laser, with regular (every 13 ns) pulses at target.
2) Use laser pulse to trigger super-fast sensor (streak tube), record 1 line of 1 frame
3) Mechanically (w/ mirrors), scan field of view up across entire vertical field
4) Assemble results of 2 & 3 into a single frame
5) Increase delay between pulse and sensor (you could do this by e.g. moving the triggering sensor half a mm each time)
6) Repeat process to create "next" frame in the movie; assemble into BW movie (which is shown in the middle of all their videos)
7) Use the BW movie as an overlay (affecting luminosity? not my department) onto the still image, mostly so you can see a super-cool color image (and partly, I assume, to conceal how low-res the source data actually is).
(Note that steps 3-6 could be done in a lot of different orders; there's not enough info on the web site to figure that out.)
The beauty of this method is that very little actually needs to be fast -- the trigger sensor and the streak tube, but that's it. The bottleneck in most high-speed photography is the massive flood of data when you try to grab whole frames at once. With this you can just wait patiently until the rest of the equipment catches up, then grab the next line.
In my lab we've done versions of this same experiment, except with bullets -- way more fun with light, especially given that light simplifies things a little because it always moves at the same speed...
Plus you don't have to constantly take on and off your hearing protection with light and don't get me started about getting little pieces of apple in your mouth
posted by range at 9:47 AM on December 13, 2011 [2 favorites]
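Read as code, range's steps amount to two nested loops over trigger delay and mirror position. A minimal sketch, in which the scan geometry and the hardware call are hypothetical stand-ins, not a real instrument API:

```python
import numpy as np

N_DELAYS, N_LINES, LINE_PX = 480, 600, 672   # assumed scan geometry, not MIT's

def record_streak_line(delay_ps, line):
    """Hypothetical placeholder for one streak-tube exposure: trigger on the
    laser pulse, wait delay_ps, read back one line of intensities."""
    return np.zeros(LINE_PX)  # real hardware would return measured data

movie = np.zeros((N_DELAYS, N_LINES, LINE_PX))
for d in range(N_DELAYS):          # step 5: step the pulse-to-sensor delay
    for line in range(N_LINES):    # step 3: mirror-scan the vertical field
        movie[d, line] = record_streak_line(delay_ps=2 * d, line=line)

# Steps 4 and 6: movie[d] is one assembled B&W frame; playing the first axis
# in order gives the slow-motion movie shown in the middle of their videos.
```

The bandwidth point range makes falls out of this structure: only record_streak_line has to be fast; the loops around it can run as slowly as you like.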
How does the streak tube work? One explanation says that the streak tube works by deflecting photons in an electric field, but that can't be right, can it? Photons don't have charge, so how would you deflect them? Something prism-like, I'm guessing.
posted by whuppy at 10:19 AM on December 13, 2011
Well, this is arguably the coolest physics demo I've seen.
You can actually see the wave packet from the laser pulse traveling through the coke bottle, and then being reflected and absorbed into the bottle cap.
Range seems to be correct as to the implementation of this system, and I'm bloody impressed that they got such good results considering just how bright, and just how targeted, these snapshots must need to be. Photons are a whole lot less numerous than you might think (especially when you're only capturing one trillionth of a second's worth of reflected light), and the human eye can detect as few as 90 photons, although that's not remotely enough to illuminate an entire scene, and current electronic image sensors aren't quite that sensitive.
Suppose we're lighting a black room with a 100W light bulb, and want to take a photo of our light bulb. Pretty easy, right?
A 100W light bulb radiates about 2.5e19 photons (that's 25 followed by 18 zeros) in all directions per second. That's an awful lot.
However, only a small fraction of those photons actually reach the camera. Let's say the camera is 15 feet from the bulb. To figure out how many photons make it into the camera, we imagine a sphere with the light bulb at the center, and the camera stuck to one of its walls. A normal DSLR image sensor is about 330 mm², while our sphere's surface area is roughly 314 m².
Only about 0.0001% of the light from the bulb actually reaches the camera. (Reflected light would bring that number up somewhat, but for the sake of this argument the room is black and there are no reflections.) In the end, 26 trillion photons make it into the camera each second.
However, nobody takes photos with a 1-second exposure, so we'll assume a "normal" 1/250 sec shutter speed, and conclude that 100 billion photons entered our camera's sensor to produce our artistic photo of a 100W light bulb in a black room. This is ignoring any effects that the camera's lens or other optics might have.
Now, let's try to take that photo with a one-trillionth-of-a-second shutter speed, and only 26 photons make it to the sensor. That's barely enough for the camera to detect, and certainly doesn't provide us with enough information to produce an image.
Worse still, we're using a best-case scenario, in which we're staring directly into the light. To capture a useful image, you really need to collect photons that have been reflected off of the object you want to photograph.
If I'm only collecting 0.0001% of the output of a light bulb that I am staring directly into, I'm capturing even less reflected light off of whatever object that light's illuminating.
If we put an object next to the light source in the scenario above, I don't think it's unreasonable to conclude that you might not capture a single reflected photon off of that object in a trillionth-second exposure.
Even worse still, this group was trying to visualize laser pulses. The only way you can see a laser beam through the air is for the light to be scattered off of molecules and particles in the air. A very bright laser is only going to produce a fairly dim "beam" when viewed from the side.
Basically, what I'm saying is that you need a ridiculously bright light source to produce an image using this technique, and also a very sensitive sensor.
posted by schmod at 10:41 AM on December 13, 2011 [7 favorites]
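Running schmod's arithmetic end to end (note the sphere at 15 feet actually comes out to ~262 m²; his ~314 m² corresponds to rounding the radius up to 5 m, which is why this lands slightly above his 26 photons):

```python
import math

photons_per_s = 2.5e19            # assumed visible-photon output of a 100 W bulb
r = 15 * 0.3048                   # camera distance: 15 ft in metres
sphere_area = 4 * math.pi * r**2  # ~262 m^2 (schmod rounds to ~314 m^2)
sensor_area = 330e-6              # 330 mm^2 DSLR sensor, in m^2

frac = sensor_area / sphere_area
print(f"fraction reaching sensor: {frac:.2e}")           # ~1.3e-6, i.e. ~0.0001%
print(f"per second:  {photons_per_s * frac:.1e}")        # ~3e13 photons
print(f"at 1/250 s:  {photons_per_s * frac / 250:.1e}")  # ~1e11, the '100 billion'
print(f"in 1 ps:     {photons_per_s * frac * 1e-12:.0f} photons")  # a few tens
```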
Schmod: I think Range worked this out: With the appropriate fast shutter, you can expose the sensor to repeated images of the same* 26 photons until they add up to the 100 billion you need to get the right exposure to record the image, then you move on to the next frame (or line, or pixel or whatever).
*the same, as in a repeats of the same time step.
posted by Popular Ethics at 11:32 AM on December 13, 2011
It looks as if they have edited the video to remove the Coca-Cola logo from the bottle. The logo is still visible on the video preview image, and on an image on the BBC article about this. I find this a little curious; you'd think that they would be concerned that editing the video in this way would compromise the integrity of the results they are presenting to the public.
posted by oulipian at 11:55 AM on December 13, 2011
They might be using something similar to lucky imaging and capturing multiple exposures, although I think that the streak tube might be sensitive enough (and the laser bright enough) that they can skip that step. Basically, I think it's more analogous to a flatbed scanner, or scanning back.
I'm honestly unfamiliar with streak tubes, and google's failing to enlighten me.
posted by schmod at 12:03 PM on December 13, 2011
... you'd think that they would be concerned that editing the video in this way would compromise the integrity of the results...
My guess is that from their perspective the video is so heavily tampered-with that removing a logo is completely un-remarkable. (And really is the least of what they've done, although it is the most obvious.)
posted by range at 12:43 PM on December 13, 2011
As a CG lighting artist, this is basically pornography to me.
"Oh yeah, you're not so coherent now that you're inside my participating media, are you, little laser?"
posted by balistic at 2:03 PM on December 13, 2011
"Oh yeah, you're not so coherent now that you're inside my participating media, are you, little laser?"
posted by balistic at 2:03 PM on December 13, 2011
How does the streak tube work? One explanation says that the streak tube works by deflecting photons in an electric field, but that can't be right, can it? Photons don't have charge, so how to deflect?
I was wondering this too. From the wikipedia article linked to by the trillionfps article, it sounds like the photons themselves aren't deflected. The image falls on a photocathode, and the emitted electrons are focused and deflected in such a way as to get the appropriate slice of image. It says this technique can get a temporal resolution of 100 fs, which is orders of magnitude better than what these images require, so it's probably what they're using.
There are some electro-optical effects like the Kerr, Faraday, and Pockels effects that are used for high speed manipulation of light (including earlier high-speed photography— taking pictures of a nuclear bomb is one way to solve the illumination problem, I guess…)
posted by hattifattener at 10:04 PM on December 13, 2011 [1 favorite]
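The streak principle hattifattener describes boils down to a linear map from arrival time to position on the detector, applied to photoelectrons rather than photons; a toy version with a made-up ramp rate:

```python
RAMP = 3.0   # detector pixels per picosecond of arrival time (illustrative only)

def streak_position(arrival_ps, x0=0.0):
    """Map a photoelectron's arrival time to a deflection on the detector:
    electrons emitted by the photocathode cross a rising voltage ramp, so
    later arrivals are deflected further."""
    return x0 + RAMP * arrival_ps

for t_ps in (0.0, 0.5, 1.0, 1.5):
    print(f"arrival {t_ps} ps -> pixel {streak_position(t_ps):.1f}")
# One spatial axis of the sensor thereby becomes a picosecond-scale time axis.
```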
This thread has been archived and is closed to new comments