It makes sense you'd be able to get a much higher refresh rate on a tube if you reduce the resolution, since you would be reducing the beam's travel.
Changing the resolution on a CRT normally doesn't make the picture smaller. There is no native resolution; phosphors are not pixels. My Viewsonic would display 640x480 or 1600x1200 across the whole 21” regardless. You can also watch the video; it's not using a smaller area.
I believe the limitation is bandwidth, not the electron beam.
There is a limit set by the spacing of the colour phosphor bands, though. If you want colour, the beam has to hit the spots where the correct phosphors are, and that limits the usable resolution.
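To put rough numbers on that: the phosphor triad spacing (dot pitch) caps how much detail the tube can resolve, no matter what signal you feed it. A back-of-the-envelope sketch in Python, assuming a 21" 4:3 tube with a ~19.8" viewable diagonal and a 0.25 mm dot pitch (illustrative figures, not taken from the video):

```python
# Rough estimate of the finest resolution a CRT's phosphor layout can resolve.
# Illustrative numbers: a 21" 4:3 tube with ~19.8" viewable diagonal
# and a 0.25 mm dot pitch (typical for a late-90s high-end monitor).
VIEWABLE_DIAGONAL_MM = 19.8 * 25.4   # viewable diagonal in mm
DOT_PITCH_MM = 0.25                  # spacing between same-colour phosphor triads

# For a 4:3 aspect ratio, width = diagonal * 4/5 and height = diagonal * 3/5.
width_mm = VIEWABLE_DIAGONAL_MM * 4 / 5
height_mm = VIEWABLE_DIAGONAL_MM * 3 / 5

max_h = int(width_mm / DOT_PITCH_MM)   # distinct triads across the width
max_v = int(height_mm / DOT_PITCH_MM)  # distinct triads down the height

print(f"~{max_h} x {max_v} addressable colour triads")
# Feed it much more than that and adjacent "pixels" start landing on
# the same phosphor triads, blurring together.
```

That lands right around 1600x1200, which lines up with the 21" Viewsonic mentioned above.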
What do you mean? The shadow mask ensures the gun for each colour can only hit the phosphors of that colour. How would a lower resolution change that?
Yeah I didn't think it would make the "pixels" smaller, but the beam would need to pulse less often and therefore could travel more. Maybe I'm misunderstanding what they did.
Electron beams scan insanely fast; that isn't the limiting factor. Getting that much bandwidth across a VGA cable is tough. If you wanted super high refresh rates on old CRTs you'd have to drop the resolution. Same concept.
Ah, I see. So reducing the resolution was more about sending frames to the monitor faster, not about optimizing the tube hardware's behaviour.
Yeah, basically you can only signal "on-off" so many times a second over a VGA cable before the ons and offs get blurry and unusable. So you can trade lower resolution for a higher frame rate, as long as you keep the total number of on-offs per second below that limit.
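To make that concrete, you can work out the pixel clock each mode needs; every active pixel is one of those on-offs, plus blanking time while the beam retraces. A rough sketch, assuming ~25% blanking overhead and a 400 MHz pixel-clock budget (both illustrative figures, not specs from the video):

```python
# Pixel clock needed for a given mode: every pixel is one "on-off" on the
# analog VGA signal, plus blanking intervals while the beam retraces.
BLANKING_OVERHEAD = 1.25       # ~25% of each frame is blanking (illustrative)
PIXEL_CLOCK_BUDGET_MHZ = 400   # assumed RAMDAC/cable limit, for illustration

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz for width x height @ refresh_hz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1600, 1200, 85), (1600, 1200, 240),
                 (640, 480, 240), (320, 120, 700)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= PIXEL_CLOCK_BUDGET_MHZ else "exceeds budget"
    print(f"{w}x{h} @ {hz} Hz -> ~{clk:.0f} MHz pixel clock ({verdict})")
```

Same total on-off budget either way; you just spend it on pixels or on frames.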
> 320 x 120 resolution
I am embarrassed for having read that
The best games!
"Ancient". I had one up until 2008.
Edit: I guess when you think about it, someone born then would be entering high school now. Fuck, I'm old. It's weird because all the 2000s just blend together for me; there's nothing defining the decades anymore, like 80s = cocaine and big hair, and 90s = neon and "radicool" stuff.
I loved my 1600x1200 Viewsonic, used it till 2010 or so. The flicker wasn't ideal, but man the colors were so much more vibrant than shitty LCD screens of the 2000s were capable of. These days, I think Apple's fancy LCDs with HDR win on all fronts, but it took a while to get here.
iPhones were released in 2007, so in 2008 smartphones hadn't really taken off yet. We old.
I need newly made modern CRTs. The salvaged ones I have tested all have degraded images, and none go that fast even at tiny resolutions.
Fun overclocking project. I wonder how far you can go with the fastest LCD monitors.
Dell is already at 500Hz 1080p
Is it really worth the cost after 144 Hz, though? Are there applications for a higher refresh rate than the human eye can even see?
This guy is pretty excited about it: https://www.youtube.com/watch?v=nqa7QVwfu7s
He says it looks "real"
> higher refresh rate than the human eye can even see
There is no fixed limit on refresh rate that we can see, that's not how seeing works.
Thanks for the answer!
I know there's no fixed numeric limit for the human eye, obviously, but it seems like beyond a certain screen refresh rate our eyes wouldn't really notice a difference, yeah?
The rods and cones in your eye, and the nerves that carry their signals to your brain, have signalling limits: they can only fire so fast, and they need time to reset. It also depends on the lighting and what you're focused on.
That's why film can get away with 24 frames per second: in a dark theatre with a bright screen, 24 fps is enough to blur that signalling so it looks like decent motion. The one thing cinematographers have to watch out for is large panning shots, since our peripheral vision is tuned for more rapid response and we can see the judder out of the corner of our eyes.
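Rough numbers make the panning problem concrete: at a given pan speed, the image jumps a fixed angular distance between frames, and at 24 fps that jump is big enough to notice. A quick sketch, with an illustrative pan speed:

```python
# Per-frame jump of a panning shot: at low frame rates the image leaps
# a visible angular distance between frames, which reads as judder.
PAN_SPEED_DEG_PER_S = 30.0   # a brisk pan, illustrative figure

for fps in (24, 60, 120):
    jump_deg = PAN_SPEED_DEG_PER_S / fps
    # 1 arcminute (~0.017 deg) is roughly the finest detail the fovea
    # resolves, so a jump of over a degree per frame is easy to perceive.
    print(f"{fps:3d} fps -> {jump_deg:.2f} deg jump per frame "
          f"(~{jump_deg * 60:.0f} arcminutes)")
```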
I could see the 60Hz flicker of crt monitors back in the day if I had a larger monitor or was working next to someone with 60Hz. Not when I was directly looking at it, but when it was in my peripheral vision. The relatively tiny jump to 72Hz made things so much nicer for me.
For Joe Everyman with a reaction time of 250-300 ms it would probably not be worth the additional cost, but for esports players, whose reaction times are already half that, it starts to matter more, especially for games that run synchronously on a tick system.
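For a sense of scale, you can compare the average frame-delivery delay at each refresh rate against those reaction times (the reaction-time figures are the ones from this comment; the rest is plain arithmetic):

```python
# Average added display latency is roughly half a frame time, since a new
# frame can land anywhere within the refresh interval.
REACTION_TIMES_MS = {"Joe Everyman": 275, "esports player": 137.5}

for hz in (60, 144, 240, 500):
    frame_ms = 1000 / hz
    avg_delay_ms = frame_ms / 2
    shares = ", ".join(
        f"{avg_delay_ms / rt:.1%} of {who}'s reaction time"
        for who, rt in REACTION_TIMES_MS.items()
    )
    print(f"{hz:3d} Hz: {frame_ms:.1f} ms/frame, "
          f"~{avg_delay_ms:.1f} ms avg delay ({shares})")
```

The milliseconds saved shrink quickly past 144 Hz, which is why the extra refresh rate mostly matters at the margins competitive players operate in.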
Ah, so like the esports competitions for LoL and the like? That makes more sense. Thanks!
Was it worth it tho?