Which comes first, the pixel or the ping?
May 1, 2012 5:34 PM
On a desktop computer, which is faster: sending an IP packet 3,000 miles across the Atlantic Ocean, or sending a pixel a couple of feet to your monitor? According to John Carmack, the IP packet arrives first.
Carmack is the well-known and highly respected programmer of classic PC games such as Doom and Quake, and the co-founder of id Software, one of the first PC game companies. The surprising result comes from significant delays in processing the user's input, and the time it takes for a monitor to actually draw a pixel.
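A quick back-of-the-envelope check of the claim, as a Python sketch; the fiber speed and the display-chain figures below are illustrative assumptions, not numbers from Carmack's answer:

```python
# Back-of-the-envelope: one-way trip for the packet vs. a plausible
# display chain. All figures below are assumptions for illustration.
C = 299_792_458                  # speed of light in vacuum, m/s
FIBER_FACTOR = 0.66              # light in fiber travels at ~2/3 c
distance_m = 3000 * 1609.344     # 3,000 miles in metres

packet_ms = distance_m / (C * FIBER_FACTOR) * 1000
print(f"theoretical one-way fiber time: {packet_ms:.0f} ms")   # ~24 ms

# A pessimistic-but-plausible display chain, in milliseconds:
# one game frame + one compositor frame + one frame of monitor
# buffering + LCD pixel response.
display_ms = 16.7 + 16.7 + 16.7 + 10
print(f"hypothetical display chain: {display_ms:.0f} ms")      # ~60 ms
```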
That does not surprise me. The fact is that display technology has not come anywhere as far as circuits and packet communications, because of processing speeds. No display has the broadband and system strength to draw a pixel in the same time frame as a packet going and coming. I am sure that you could time pixel drawing somehow and find the difference if you wanted to.
posted by parmanparman at 5:48 PM on May 1, 2012
The man is crazy about response time.
fucking lpb
posted by nathancaswell at 5:48 PM on May 1, 2012 [17 favorites]
Imagine if he was your father. "Put your clothes away!" ... "I told you to put your clothes away three nanoseconds ago, get a move on!"
posted by greenhornet at 5:49 PM on May 1, 2012 [3 favorites]
The cool thing about this is John Carmack popped up on SuperUser just to answer this question.
Low latency displays sound like the kind of thing Apple should work on. Both for games on iOS devices and just a computer that feels more responsive.
posted by Nelson at 5:59 PM on May 1, 2012
I knew I was living in the future when I realized it was easier for me to stream data via Netflix from a server hundreds of miles away than to walk across the room and put a disc in my DVD player.
(I guess that's only tangentially related, but still)
posted by dirigibleman at 6:01 PM on May 1, 2012 [25 favorites]
Thanks to integration of graphical effects into OS windowing systems, and the widespread adoption of flat panel monitors, the latency of screen display has actually been getting worse in the last decade.
posted by idiopath at 6:03 PM on May 1, 2012 [5 favorites]
sending a pixel a couple of feet to your monitor
....result comes from significant delays in processing the user's input
Sending a pixel to a monitor doesn't require user input.
posted by DU at 6:05 PM on May 1, 2012 [6 favorites]
The fact is that display technology has not come anywhere as far as circuits and packet communications, because of processing speeds. No display has the broadband and system strength to draw a pixel in the same time frame as a packet going and coming.
This is nonsense.
posted by grouse at 6:08 PM on May 1, 2012 [9 favorites]
One experience that's quite startling is a window that's split between an LCD and a CRT monitor. Grab the title bar and move the window around. You'll see that the LCD lags WAAAYYY behind the CRT in updating the position of the window.
posted by jepler at 6:10 PM on May 1, 2012 [3 favorites]
The fact is that display technology has not come anywhere as far as circuits and packet communications, because of processing speeds.
I don't think he measured that. Seems like he just assumed a "best case" scenario of 70ms travel time based on the speed of light, and found some monitors that appeared to take longer than that to update their screen based on input from a game controller.
posted by ShutterBun at 6:13 PM on May 1, 2012
Furthermore, he actually attributes the slowness to bad driver software. "Unfortunate but fixable" he calls it.
posted by DU at 6:14 PM on May 1, 2012
One experience that's quite startling is a window that's split between an LCD and a CRT monitor. Grab the title bar and move the window around. You'll see that the LCD lags WAAAYYY behind the CRT in updating the position of the window.
Sure enough
posted by ShutterBun at 6:17 PM on May 1, 2012
....result comes from significant delays in processing the user's input
Sending a pixel to a monitor doesn't require user input.
Well, crap. When I wrote this post there were a number of comments on Carmack's answer from a USB engineer talking about the latency on the controller side, and how bad that was, too. Looks like a SuperUser mod deleted those comments. Ah, the joys of the Living Web.
posted by Frayed Knot at 6:18 PM on May 1, 2012
Furthermore, he actually attributes the slowness to bad driver software. "Unfortunate but fixable" he calls it.
No, he's blaming it on the software (firmware) running on the device itself, not the driver on his computer.
posted by aubilenon at 6:20 PM on May 1, 2012 [1 favorite]
The situation in the 80s was really quite a lot better. With a system like the Commodore 64 you really could light up any particular pixel in 1/60s (16.7ms) maximum, 1/120s (8.3ms) average. If you just wanted to light up anywhere on the screen, it must have been somewhere in the realm of 1.4ms maximum (if the beam just left the displayable area, it's got 22 non-displaying rows at about 63.5µs each before the first displayable row of the next field).
posted by jepler at 6:21 PM on May 1, 2012 [1 favorite]
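jepler's Commodore 64 numbers can be reproduced from standard NTSC field timing. A minimal sketch, assuming 60 fields per second and roughly 63.5 µs per scan line:

```python
# Reproducing jepler's Commodore 64 numbers from NTSC field timing.
# Assumes 60 fields/s and ~63.5 us per scan line.
line_us = 63.5
field_ms = 1000 / 60                 # ~16.7 ms per field

worst_pixel_ms = field_ms            # beam just passed your pixel
average_pixel_ms = field_ms / 2      # ~8.3 ms on average
anywhere_ms = 22 * line_us / 1000    # 22 non-displayed rows: ~1.4 ms

print(f"specific pixel, worst case: {worst_pixel_ms:.1f} ms")
print(f"specific pixel, average:    {average_pixel_ms:.1f} ms")
print(f"anywhere on screen, max:    {anywhere_ms:.1f} ms")
```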
Here is the rub: a 1080p screen is 1920x1080 pixels at 32 bits per pixel, which is about 8.3 megabytes, 60 times per second. Try sending that amount of data around the world and see if it beats the LCD.
It is wide pipes versus fast pipes.
posted by gjc at 6:22 PM on May 1, 2012 [11 favorites]
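Worked through, gjc's arithmetic looks like this; the 10 Gbps link rate is an assumed figure (anticipating wierdo's reply further down), not anything gjc specified:

```python
# gjc's arithmetic, reworked. 1080p, 32 bits per pixel, 60 Hz.
width, height, bytes_per_px, hz = 1920, 1080, 4, 60

frame_bytes = width * height * bytes_per_px                  # 8,294,400
print(f"one frame:  {frame_bytes / 1e6:.1f} MB")             # ~8.3 MB
print(f"sustained:  {frame_bytes * 8 * hz / 1e9:.1f} Gbps")  # ~4.0 Gbps

# Time to serialize a single frame onto an assumed 10 Gbps link,
# ignoring IP overhead:
serialize_ms = frame_bytes * 8 / 10e9 * 1000
print(f"one frame on a 10 Gbps wire: {serialize_ms:.1f} ms")  # ~6.6 ms
```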
Both for games on iOS devices and just a computer that feels more responsive.
Folks can rest assured that Apple's engineers are well aware of this issue because iOS is already engineered to be very responsive, at least compared with other options. As iOS developers know, the operating system was written from the ground up so that UI events are given first priority: You tap the screen and that event is processed in a pretty remarkable amount of time — not just handling the tap, but also recalculating and redrawing the screen. In fact, so much optimization has been done here that trying to force UI updates onto a background thread is undefined behavior, and will often break an app's interface.
posted by Blazecock Pileon at 6:22 PM on May 1, 2012 [1 favorite]
No, he's blaming it on the software (firmware) running on the device itself, not the driver on his computer.
Well, I'm using "driver" here to mean "software below the level of OS" although I guess strictly speaking that software isn't even on the CPU. But the point is, it's a software problem, not a hardware one.
posted by DU at 6:26 PM on May 1, 2012
Did you read the fine article, Blazecock Pileon? We're talking about LCD lag. AFAIK Apple's devices are just as vulnerable to that as anyone else's, although maybe they're a bit better than the usual consumer Windows hardware junk. I don't know, maybe someone's measured. Anyway, we're not talking here about software frameworks for handling events and display updates.
posted by Nelson at 6:27 PM on May 1, 2012
Also, it's too bad about the latency and all, but IMO it's way more important that things be crisp and undistorted. Simply drawing straight horizontal lines on a relatively flat CRT screen that's not also stupidly deep and heavy is like a heroic feat of engineering.
gjc: This is not the issue at all. The original wording is a little misleading. The data gets to the monitor plenty quick. It just takes the (specific) monitor all day to actually show you the update, once it's gotten it.
LCDs are always slower than CRTs, but it's not like your 1 GB/s dual DVI cable has 70 megs of data in transit at a time. That's all sitting in buffers on the monitor, or it's the pixel response time as the liquid crystal itself changes phase.
Nelson, most of the lag he talks about here is not LCD lag. Carmack clearly says that most LCDs take 5-20ms to update. This head mounted thing takes 70.
posted by aubilenon at 6:29 PM on May 1, 2012 [1 favorite]
I don't think he measured that. Seems like he just assumed a "best case" scenario of 70ms travel time based on the speed of light, and found some monitors that appeared to take longer than that to update their screen based on input from a game controller.
Then why would he note that "The time to send a packet to a remote host is half the time reported by ping, which measures a round trip time"?
posted by kenko at 6:39 PM on May 1, 2012
Did we all read the same article? He blames the issue mostly on multiple buffers, some of which are part of the driver, some of which are in the monitor itself for the purposes of drawing menu overlays, and some of which are part of the OS, since it's multiple-buffering for whatever reason, presumably window compositing.
The mention of iOS devices is a non-sequitur as the screen is completely integrated into those devices so there's no opportunity for rogue framebuffers to stick themselves between the developer and the pixels like there is in PCs. There's no rogue framebuffers in Android or Windows Phone or Windows 8 RT devices either.
posted by GuyZero at 6:40 PM on May 1, 2012 [2 favorites]
You can purchase low-latency displays -- they're just expensive as hell because it's a hard damn problem. As someone mentioned upthread, they require significantly higher bandwidth connections (to allow more data to travel per timeslice and therefore limit the buffering that causes latency). Plus there's the chemistry of physically rotating molecules fast enough.
posted by spiderskull at 6:40 PM on May 1, 2012
and people thought I was loopy for wanting a firewire keyboard (these days, I guess I want a thunderbolt one)
posted by oonh at 6:43 PM on May 1, 2012 [1 favorite]
BTW, in place of framebuffers-within-framebuffers, iOS and Android devices compose multiple sub-windows with the GPU, essentially making each subwindow akin to a texture which the GPU knows how to put together very fast. Or something approximately like that.
posted by GuyZero at 6:44 PM on May 1, 2012
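As a loose analogy for that single-pass compositing, here is a toy sketch in which pygame Surfaces stand in for per-window GPU textures; real compositors do this on the GPU, not with CPU blits:

```python
# A toy analogy of single-pass compositing: pygame Surfaces stand in
# for the per-window GPU textures. Illustrative structure only.
import pygame

pygame.init()
screen = pygame.display.set_mode((800, 600))

# Each "window" keeps its own off-screen buffer.
win_a = pygame.Surface((400, 300)); win_a.fill((200, 60, 60))
win_b = pygame.Surface((300, 200)); win_b.fill((60, 60, 200))

clock = pygame.time.Clock()
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    # One compositing pass straight into the display buffer -- no
    # intermediate full-screen copies stacking up extra frames of lag.
    screen.fill((0, 0, 0))
    screen.blit(win_a, (50, 50))
    screen.blit(win_b, (250, 200))
    pygame.display.flip()
    clock.tick(60)
pygame.quit()
```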
Circuits and packet communications are heavily optimized because there's a computer waiting for input at the receiving end, one with sub-nanosecond response capability that has run out of things to do and is eagerly awaiting input.
There's no point updating a human readable display faster than every 5-20 milliseconds (read: 50-200 fps) because: why? It's already fast enough to blur your vision. Olympic athletes need more than 100ms to respond to a starter gun. Even involuntary reflexes take tens of milliseconds. Updating every 5-20ms is already literally updating faster than you'll notice, so why improve it further?
posted by ceribus peribus at 6:49 PM on May 1, 2012 [2 favorites]
An enterprising individual could make a device to measure this. It would consist of a light sensor, one of those USB-connected microcontrollers like Arduino, and a program. Microcontroller reads light sensor. Sends data to PC. If PC is told "input is dark", it clears the screen to white. Otherwise, it clears the screen to black. Point light sensor at display. Use frequency counter or oscilloscope (or the microcontroller itself) to measure the time it takes for a black-to-white-to-black cycle. For a device with an analog VGA input, you could further eliminate the PC from the equation by directly generating a VGA signal that is all-white or all-black, and by subtraction measure the part of the latency that derives from the PC.
posted by jepler at 6:49 PM on May 1, 2012 [2 favorites]
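For the host side of that loop, a minimal sketch might look like the following; it assumes pyserial and pygame, plus a hypothetical microcontroller that writes b'D' (dark) or b'L' (light) for each sensor reading. The cycle time itself would still be measured at the sensor, per jepler:

```python
# Host side of the black/white loop described above -- a minimal
# sketch, not a tested tool. Assumes pyserial and pygame, and a
# hypothetical microcontroller that writes b'D' (sensor sees dark)
# or b'L' (sensor sees light) over serial.
import serial
import pygame

PORT = "/dev/ttyACM0"  # hypothetical device path

def main():
    link = serial.Serial(PORT, 115200, timeout=1)
    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        reading = link.read(1)
        if reading == b'D':              # dark -> paint white
            screen.fill((255, 255, 255))
        elif reading == b'L':            # light -> paint black
            screen.fill((0, 0, 0))
        pygame.display.flip()
    # One full black->white->black cycle, timed at the sensor with a
    # scope (or the microcontroller), is two trips through the whole
    # sensor -> PC -> screen pipeline.

if __name__ == "__main__":
    main()
```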
An enterprising individual could make a device to measure this.
There is a link to an article on Eurogamer within the answer section, which demonstrates the use of just such a device.
posted by ShutterBun at 6:51 PM on May 1, 2012 [1 favorite]
Shneiderman [11] showed that the lag in handling user events can induce frustration, annoyance, and anger, and that it significantly affects user productivity [10], and he finds that a lag beyond 100 ms is perceptible by users. It doesn't take too many people involved in computer design, each deciding they can take 5-20 milliseconds for the task at hand, before the sum of all the latencies hits 100ms, particularly when you care about the 99th percentile response time (delays that arise only infrequently).
[10] B. Shneiderman. Response time and display rate in human performance with computers. ACM Comput. Surv., 16(3), 1984.
[11] B. Shneiderman. Designing the user interface: strategies for effective human-computer interaction. Addison-Wesley, 1986.
LagAlyzer: A latency profile analysis and visualization tool [PDF]
posted by jepler at 6:57 PM on May 1, 2012 [4 favorites]
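To make the budget arithmetic concrete, a small sketch that sums a hypothetical latency chain against that 100 ms threshold; every figure is an illustrative stand-in, not a measurement:

```python
# Summing a hypothetical latency chain against Shneiderman's 100 ms
# threshold. Every figure is an illustrative stand-in.
stages_ms = {
    "controller/keyboard polling":  8,
    "OS event delivery":            5,
    "game logic (one 60 Hz frame)": 16,
    "render + vsync wait":          16,
    "compositor (one more frame)":  16,
    "monitor's internal buffering": 30,   # overlay/scaler processing
    "LCD pixel response":           10,
}
total = sum(stages_ms.values())
print(f"total: {total} ms -- {'over' if total > 100 else 'under'} "
      "the 100 ms perceptibility budget")
# A handful of stages at 5-20 ms each is all it takes, and the
# 99th-percentile spikes are worse still.
```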
and people thought I was loopy for wanting a firewire keyboard (these days, I guess I want a thunderbolt one)
Actually, I've heard that the older PS/2 keyboards are faster because they're based on interrupts rather than polling, and some measurements from 2002 (via superuser) seem to bear that out, with PS/2 keyboards having response times from 2.8 ms to 10.9 ms, vs. USB keyboards' response times of 18.8 ms to 32.8 ms.
posted by Pyry at 6:59 PM on May 1, 2012 [3 favorites]
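The polling cost alone is easy to ballpark. A full-speed USB HID keyboard is polled at an interval set by its endpoint descriptor (bInterval), commonly 8 ms; a PS/2 keyboard raises an interrupt as soon as it has a scancode. A sketch of the arithmetic, with the interval as a typical assumed value:

```python
# Ballpark for USB HID polling cost alone. The 8 ms interval is a
# typical bInterval for full-speed keyboards, not a measurement.
poll_interval_ms = 8.0

worst_ms = poll_interval_ms        # keypress lands just after a poll
average_ms = poll_interval_ms / 2  # keypress lands at a random moment

print(f"polling alone adds ~{average_ms:.0f} ms on average, "
      f"up to ~{worst_ms:.0f} ms worst case, before any debounce "
      f"or matrix-scan delay inside the keyboard itself")
```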
There are high speed digital oscilloscopes specifically designed for measuring signal response times in time-critical systems. Put the incoming data feed on one lead and a light-sensitive diode input on the other feed, graph in femtosecond scale if necessary.
posted by ceribus peribus at 7:01 PM on May 1, 2012 [2 favorites]
Now I have yet another convenient excuse for not playing games anymore.
"The screens are too laggy! My reflexes were trained on CRTs, MF'ers!"
posted by tmt at 7:01 PM on May 1, 2012 [1 favorite]
"The screens are too laggy! My reflexes were trained on CRTs, MF'ers!"
posted by tmt at 7:01 PM on May 1, 2012 [1 favorite]
There's an interesting blog post by someone who worked on Halo about reducing the melee attack animation times to the point that button press and effect feel like they're the same thing: Simultaneous Perception.
Optimising game code to reduce input lag seems obvious, but I'd never thought about how much delay can be introduced purely by the hardware you're using.
posted by lucidium at 7:02 PM on May 1, 2012 [3 favorites]
Temporal Consciousness > Some Relevant Empirical Findings
posted by jepler at 7:06 PM on May 1, 2012 [1 favorite]
Let's hope this guy has never played an Xbox Kinect, because "laggy" doesn't even begin to describe it.
posted by ShutterBun at 7:30 PM on May 1, 2012
Optimising game code to reduce input lag seems obvious, but I'd never thought about how much delay can be introduced purely by the hardware you're using.
It's particularly aggravating in some cars these days - step on the gas, and some very long time later, the rpms move.
posted by Pogo_Fuzzybutt at 7:51 PM on May 1, 2012 [4 favorites]
There are high speed digital oscilloscopes specifically designed for measuring signal response times in time-critical systems. Put the incoming data feed on one lead and a light-sensitive diode input on the other feed, graph in femtosecond scale if necessary.
It looks like LeCroy's new 65GHz scope will resolve a single event at about 8 picoseconds. It probably costs enough to make a powerball jackpot winner wince, though.
Yeah, his measurement techniques were sort of back of the envelope, but they got the results he needed.
posted by underflow at 8:34 PM on May 1, 2012
The mention of iOS devices is a non-sequitur as the screen is completely integrated into those devices so there's no opportunity for rogue framebuffers to stick themselves between the developer and the pixels like there is in PCs. There's no rogue framebuffers in Android or Windows Phone or Windows 8 RT devices either.
"Sir... Buffer 09A has gone off the grid. He's blocking all the pixels until we pay his ransom! He's gone rogue."
ROGUE FRAMEBUFFER
Summer 2013
Film not yet rated.
posted by kmz at 8:41 PM on May 1, 2012 [3 favorites]
"Sir... Buffer 09A has gone off the grid. He's blocking all the pixels until we pay his ransom! He's gone rogue."
ROGUE FRAMEBUFFER
Summer 2013
Film not yet rated.
posted by kmz at 8:41 PM on May 1, 2012 [3 favorites]
gjc: "Here is the rub- a 1080p screen is 1920x1080 pixels at 32 bits per pixel, which is 8.1 megabytes, 60 times per second. Try sending that amount of data around the world and see if it beats the LCD. "
Isn't that only ~6.6ms at 10Gbps? So maybe 70-75ms across the pond, after IP overhead, best case?
posted by wierdo at 8:46 PM on May 1, 2012
Saw this earlier, it's interesting.
With touchscreens, I saw one at MWC last year from Synaptics where the display driver and touch driver were aware of one another, and input was displayed essentially before it went into deep processing like window locations and such. But I think it would be too much of a change to totally redo the way input and display are handled to be more focused on latency, since most people don't notice it even when it's there.
posted by BlackLeotardFront at 8:50 PM on May 1, 2012
I love this thread. More nerding, please.
This actually reminds me of when LCDs started replacing CRTs everywhere, and playing Half-Life and Counter-Strike with coworkers after work. Most of my much-more-into-games coworkers were keen to upgrade to these new LCDs, and, if I'm recalling this correctly, they were advertised in the 60-70ms response time range.
And I know I remember thinking they just looked laggy and ghosty and I wanted to go back to a CRT.
Point being, I used to totally pwn those guys in killbox or de_dust playing on a slower lab computer - but with a CRT. And I wasn't really that good. It was those shitty old laggy LCDs they were using.
posted by loquacious at 11:19 PM on May 1, 2012
In other news, airplanes are not as fast as rockets, hummingbirds are not as fast as Superman, and the Earth is not travelling through space as fast as gamma-ray bursts. And Generalissimo Francisco Franco is still dead.
Did this guy learn his 'information' chops from Dave Whiner?
posted by Twang at 12:34 AM on May 2, 2012 [1 favorite]
And I know I remember thinking they just looked laggy and ghosty and I wanted to go back to a CRT.
This always drives me nuts. LCD screens seem to have got really bad lately, such that our LCD TV from 2005 has less input lag than most current models (presumably because it has almost no picture processing). When my reasonably fast LCD monitor broke the other week I was relieved to find out that Dell had just released an IPS display with 0.6ms average lag compared to CRT.
It's really nice.
posted by ArmyOfKittens at 1:02 AM on May 2, 2012 [1 favorite]
He should have used the soundcard to output a 1 sample pulse when the video change started. Then he could have used a light sensitive diode taped to the screen and an oscilloscope to measure the latency. This whole smush button with finger in front of camera and then wait for the screen to change thing sounds pretty error prone to me.
posted by public at 1:25 AM on May 2, 2012 [1 favorite]
The real WTF is describing id Software as "one of the first PC game companies".
posted by MartinWisse at 4:04 AM on May 2, 2012 [2 favorites]
I've always been too lazy to make this a FPP so I'll just drop it here: How to play classic videogame consoles on crap modern displays using a dedicated videoprocessor. Comparison of multiple de-interlacing/linedoubling/scaling devices available today and their impact on picture quality and input lag.
posted by Bangaioh at 4:37 AM on May 2, 2012 [1 favorite]
BlackLeotardFront, that's basically what X (the common window system on Linux desktop systems) did, at least in the time after common hardware cursor support and before everything started being drawn by compositing: As soon as input was received from the mouse (which was interrupt-driven for PS/2 and serial mice), the hardware cursor position was updated first, then other processing was done.
posted by jepler at 5:46 AM on May 2, 2012
Does he actually play real games? How many times has he been killed in TF2 while behind a wall because the sniper has 192ms ping?
posted by Foosnark at 6:26 AM on May 2, 2012
He should have used the soundcard to output a 1 sample pulse when the video change started. Then he could have used a light sensitive diode taped to the screen and an oscilloscope to measure the latency.
But then you have to account for the delays in the sound subsystem and sound card.
posted by atrazine at 8:06 AM on May 2, 2012
If it's about video-game responsiveness, then you *do* want to measure from button-press to action (making the eurogamer test probably a good one). If it's about the properties of the LCD panel, then the microcontroller-and-light-sensor setup is more useful. And, yes, the audio path will definitely have its own sources of delay.
posted by jepler at 8:18 AM on May 2, 2012
The point is that you can accurately measure the delays in hardware, while accurately measuring the real latency of a big fleshy sausage smooshing into a button is much harder. And if we really care about that path of latency, it's still going to be enormously dwarfed by the amount of time it takes the nerve signals to get out to your muscles, your muscles to start doing anything, and your finger to then actually travel the distance required to trigger a button press.
posted by public at 1:57 PM on May 2, 2012
All that stuff between a gesture and a pixel is also highly bound to chip timings and polling. So there is a lot of time spent waiting for some leading or trailing edge to clock before getting to the next step to do it again.
posted by clvrmnky at 2:00 PM on May 2, 2012
This thread has been archived and is closed to new comments