Hacker News
Average adult will spend 34 years of their life staring at screens (studyfinds.org)
415 points by praveenscience on May 20, 2020 | 296 comments


"If children get outside enough, it doesn't matter how much they study they do. They don't become myopic," said Ian Morgan, researcher at Australian National University.[0]

"Researchers say kids and teens need to get sunlight during the critical years of their development while their eyeballs are still growing.

"The mechanics of how sunlight protects their eyes are not clearly understood. One theory suggests that sunlight triggers the release of dopamine in the retina; another speculates that blue light from the sun protects from the condition.

"The solution is simple. Have kids "spend more time outside, have less demands (from) the schools and relax a bit," said Seang Mei Saw, professor of epidemiology at the National University of Singapore."

[0] https://www.cnn.com/2015/04/05/asia/myopia-east-asia/index.h...

edit: quotation marks


Meanwhile we put 5-6 year olds in kindergarten classrooms with UV-blocking windows all day, for about 9 months during the part of the year with the least sunlight available, we've cut recess time down to almost nothing, and schools increasingly don't send kids outside if the weather's anything other than perfect. At my kids' school, when they keep them in, they don't even hold indoor recess! They just watch a damn movie. WTF. Of course there was some stupid educational fad that led them to remove all the toys from lower-grade classrooms to make room for "learning centers" (I gather this has happened more or less state-wide), so I guess if the gym's not available they can't really do indoor recess anyway, not that it helps their eyesight either way. And this is a very highly-ranked school for our state.

1.5 hours of recess daily minimum or bust for 3rd grade and under, I say, and I think that's not quite enough, really. Screw this 30 minutes crap. Know how I can guarantee my kids have behavior problems at home? Coop them up inside all day. Know how I make a day run smoothly? Make sure they're outside running around at least a couple hours while the sun's up. Then consider that they're giving a bunch of them nearsightedness on top of definitely creating behavioral and learning problems. It's crazy.


I think it's all a side effect of the treadmill that society is putting everyone on. Between globalization and automation there are fewer good jobs that more people are competing for. Either you end up with what used to be a lot of money or with way too little. Until you've "made it", you are constantly at risk of slipping off the treadmill and into the "way too little" group. Especially in the US.

Whether or not this is objectively true, it's the experienced reality for many. We feel so much pressure to give our kids a head start, which results in this rat race starting earlier and harder. I think this is just collateral damage from that shit going wrong. I'm so glad I was a kid in the 80s, when playing outside was still normal.


We let ours play outside plenty, and there are quite a few other kids out. Does seem to be neighborhood-dependent, though, all kinda have their own culture. Our current one's less free-range than the last, which had big ol' gangs of kids with large age-ranges roaming around, and was great.

We are fairly worried about 1) eventually having some kind of encounter with child protective services—which will likely end up fine, if it happens, but will also probably be really inconvenient and stressful—and 2) injuries that end up giving us yet another opportunity to play everyone's favorite game, Medical Billing Roulette, especially now that they're riding bikes. Multiple kids, just a matter of time until one of them gets a hospital trip out of a bad fall. Odds very low they'll all make it to 18 without at least one broken bone.

God damn do I wish we'd fix healthcare in the US. It's crazy how much background stress & anxiety that causes across just about all activities and choices. And that's with insurance.

[EDIT] although I'm pretty sure the "playing outside" thing tapers off around tween/teen ages in a way it didn't for us, when they all start living on social media. :-/


> Odds very low they'll all make it to 18 without at least one broken bone.

I consider broken bones to be an integral part of childhood. Well, for boys anyway.

If you make it to 18 without a broken bone, you probably should have spent more time outside.


I’m sure you’re making a general statement, but breaking a bone is pretty traumatic and the complications can be quite serious. I hope my kids never experience a broken bone.

That said, I spent my childhood outside and never broke a bone myself, but my brother did and ended up having to wear orthotics the rest of his life and routinely experiences discomfort as a result. So, my advice, don’t limit your kids, but teach them to be careful at the same time.

I’m pretty sure if one of my kids breaks a bone, I’m going to be a wreck.

EDIT: Your comment about boys breaking bones more than girls is ridiculous, as a parent with a daughter who loves risk taking, I can assure you it’s not a boy vs girl problem.


> EDIT: Your comment about boys breaking bones more than girls is ridiculous, as a parent with a daughter who loves risk taking, I can assure you it’s not a boy vs girl problem.

"The lifetime risk of sustaining a fracture in childhood is approximately 42%-64% in boys and 27%-40% in girls, with remarkable variation in the estimates worldwide [1-4]. While fractures more often occur in males, girls usually sustain fractures at a younger age compared to boys"

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2987399/


I haven’t broken any bones. My parents were up-and-coming immigrants with their own business, but not successful enough to afford health insurance (pre-ACA, but not post-ACA either), and they weren’t poor enough for Medicaid.

So my dad was clear to me that any injuries could cause us to have medical expenses that would derail us and my younger sibling’s future as well as the aging grandparents my parents were taking care of.

So I didn’t partake in activities that could have broken bones, but that is the hand my parents and I and others like us are dealt in the great old US of A.


And don't you think that's a horrible situation to be in?

It's completely foreign to me that there exists a first-world country where you need to bar your child from certain activities solely due to concerns over potential medical costs.

That's not a criticism of your parents - their decision was completely logical in the circumstances. It's just a fundamental flaw in the US system.


Of course it’s a bad situation. Just explaining why some kids may not have had broken bones.

If you’re not in the upper class or extremely poor class in the US, there is no safety net, so if you want any chance of clawing your way up to having the various securities of life, you have to play it safe.


I like the sentiment, but there are several factors here, including inherent "agility" and of course luck.

I never broke a bone as a kid, and not only spent half my youth outdoors but was a daredevil.

Summer: Tree climbing, as high as 50'. Constant bike riding on and off road, skateboarding, go carts, dirt-bikes (hill climbs, jumps), jungle gym, stilts, pogo sticks, trampolines, swings, water-skiing, touch football, street hockey, basketball, and soccer.

Winter: Flexible flyer sledding on steep hills with frequent collisions and wipe-outs (that was half the fun), snowmobiles, tubing, skiing, skitching (when streets are snowy/icy you hold on to a car's bumper and hitch a ride)

One of my own kids has the daredevil streak also and built huge jumps for bikes in summer and snowboards in winter. They'd jump over cars.

No broken bones (lacerations, bruises, sprains however)

On the other hand, I saw a kid break his leg at 2 mph by falling off his bike against a curb.


No broken bones here before 18. I grew up on a farm and had access to many machines (bikes, quads, tractors, cars, go karts). I spent huge amounts of time outside and liked to try more and more dangerous things, partly due to being somewhat isolated, some of which would have done more than just break a bone.


There's always exceptions to every rule. That's why I said "probably".


I guess, but I feel like the schooling they get is largely bogus anyhow, especially in elementary school, and nothing I couldn't fix by just tutoring them a bit at home.


I also don't like how people put sunscreen on the kids almost every time they go out. I live in Australia, so you absolutely do want to protect your skin, but heading to the park for an hour doesn't need sunscreen every time.

2 generations ago it was common to literally lie in the sun for hours, putting coconut oil on your skin to increase tanning. And yes, they did have higher skin cancer rates, but there is a fair gap between that and not doing it, plus a long-sleeve shirt/hat.

I'm trying to move to a screen-free Sunday to encourage more outdoors time/interaction, but screens are so ingrained in our lives now that it's not popular.


> I also dont like how people put sunscreen on the kids almost every time they go out. I live in Australia, so you absolutely do want to protect your skin, but heading to the park for an hour doesn't need sunscreen every time.

I'm making an effort to get at least 15 minutes of direct sunlight every day, but at the same time, I make sure to put sunscreen on my face/neck every time I go out.

Modern attire is the problem. Our clothes cover everything but the face, hands and neck, so when we try and increase sunlight exposure, we get huge doses concentrated on these areas.

Ideally, I'd be sunning shirtless and with shorts that cover as little as legally possible. It's a much more sensible approach to boosting Vitamin D levels without risking skin cancer on the face/hands/neck/ears. Nebulous concepts of "social acceptability" be damned.


> And yes they did have higher skin cancer rates

Much, much higher. And older people who were reasonably active and spent a lot of time outside (even if they didn't deliberately go and sunbathe) have often had one or two cancerous spots cut out.


I'm not sure what to think about that. I live in CO and people here that don't sunscreen are so much more damaged than others I know in different states. It's drastically obvious. So I sunscreen my kids like crazy. I keep thinking that I'm at least giving their adult selves the option to look 50 at age 35 if they decide to quit using sunscreen religiously. Unsure if this is the best approach. Their doctor loves it though.


I'm in Australia and have an annual skin check. The doctor advises adults to use a moisturiser with sunscreen component as a daily default, especially anyone with fair hair. My wife tends to do this.

The doctor has mentioned in the past that 60+ year old males who've worked farms or building sites their entire lives have a 35%+ strike rate on finding a cancerous mole when they come in for a check.


> Then consider that they're giving a bunch of them nearsightedness on top of

If it's any consolation, the drawbacks of myopia are fairly negligible. This is one reason there's not a lot of effort expended to prevent it.


I actually enjoy it. I’m at a point where I could get laser correction, but I’ve found that I value being able to remove my glasses and quite literally de-focus too much.


I'm sure there is a correlation, maybe even a strong one, but I have quite a bit of anecdata to suggest it is not true in all cases.

(I grew up on a farm, and spent a helluva lot of time outside, and needed glasses long before my family had a computer; my father would have spent even more time outside than I did, and likewise is pretty nearsighted.)


The link references the rate of myopia in South Korea increasing from 18% in 1955 to 96% in 2011. This suggests 2 things:

1. Myopia is not entirely genetic, because genes for myopia could not have spread to almost all Koreans from almost no Koreans in the span of 56 years.

2. Myopia is somewhat genetic, because myopia existed in a significant proportion of the pre-industrial population.


Prefixing this with an acknowledgement that I totally believe there was a change in the prevalence of myopia due to changes in environmental conditions (amount of sunlight, artificial lighting, etc etc)...

I am curious whether 18% was the diagnosed rate of myopia in South Korea in 1955 or the actual rate of myopia in 1955.

In 1955 South Korea was a poor country; now it is one of the wealthier countries (per capita) in the world. On the one hand, it is entirely believable to me that access to eye correction is radically greater than ~70 years ago, and there were a lot of undiagnosed vision problems in 1955. On the other hand, it's also entirely believable to me that literally every 20 year old (male) was given a vision test in 1955 in South Korea, because of the War. On the third hand, it's also entirely believable to me that the vision test given was fairly easy to "pass", and that many of the people who passed it would have a level of myopia that we would now prescribe glasses for, because hey, better vision is better.


> Myopia is somewhat genetic

That conclusion can't be supported by the above fact alone.

It could be a development disorder with a certain probability of occurrence given other non-genetic factors.

Perhaps some pre-industrial people didn't get enough sun time because they spent too much time at home reading. Others had physical eye damage from work activities, etc.


I have always been under the impression that myopia was caused by growing too fast as a child, leading to malformed eyeballs. Basically, children with too much food get myopia. Surely that fits the same data pretty darn well... I wish I had a study to point to, but I heard/read this decades ago.


I've observed the inverse situation too. One of my friends has been writing code intensely since at least the age of 11 and still spends tons of time in front of the screen. Never needed glasses in his entire life. And then there is me, who had very limited access to screen time as a kid, and I still ended up needing glasses around the age of 15 (despite neither of my parents needing glasses, which goes to show that "genetic" doesn't necessarily mean something as simple as "your parents are this way, so you are bound to be the same way").

I am not a doctor, but imo it is similar to a lot of other health concerns. Predominantly genetic, but you can still affect the outcome in slight ways and improve your chances a bit.

Personal example: my grandfather has been smoking at least a pack a day since the age of 9, and he is still doing way better in his 80s than the majority of people his age. Working in the garden (even physically demanding stuff, like preparing the soil for potatoes every summer), fully mentally sound, etc. On the other hand, you have plenty of young people who do everything right (no smoking, healthy diet, regular proper exercise, etc.) falling due to random health ailments. Which goes to show that while taking care of yourself is important and beneficial, the hand luck dealt you can override it all, in either direction, at any moment.


I got another weird one to toss in, for myself:

Back when I first moved into my current place in 2013, I was able to read my stove's clock from across two rooms. My eyesight had steadily degraded since then until last year/early this year, where I needed to step into the closer room before I was able to read it.

...But then, the lockdown started. And since then, despite being on the computer for more of the day and not often going out into the sunlight, my eyesight has been improving. I can clearly read the clock from across both rooms again.

Since I don't really open my blinds, I've gotten pretty much continuous indoor levels of light for two months now, so I have my own theory: it's not specifically sunlight, but having to adjust to the differing brightness between indoor and outdoor light too often, that causes issues.


I agree. This is the first time I’ve heard of it. Literally everyone in my family wears glasses, including my deceased grandfather who lived into his 90s. He grew up in rural Mexico without electricity so it absolutely couldn’t have been caused by TV or Computers


What did you spend your time doing though? Did you read a lot? Did you draw? Did you spend a lot of time working on things within arm's reach? Or did you spend your time staring off into the far yonder? Isn't there a case to be made that sunlight is more of a signal used to help the eyes learn to focus, and that by staring at something a foot away with low signal strength, the eyes never got the feedback necessary to shape themselves to the environment they need to work in?


Climbing trees, chasing livestock, running through fields, catching frogs, rolling down hills, digging holes... ?

I was outside. If a small child running around on a farm for hours and hours every day, rain or shine, doesn't count as "being outside more", we need to start using a different word to refer to that.


Does anyone know if this also applies to people in their early 20s? Of course more sunlight and exercise never hurt, but I’d be more convinced if there were scientific evidence that it stops myopia progression.



Anecdotally I had great vision (20/15) before starting college. Before I went to college the eye doctor said not to bother coming back until I got married. After 3 years of looking at a screen in the dark I needed glasses.


You know that movie, Moon, with Sam Rockwell? I started feeling like the first version of the clone right after finishing university. I just keep falling apart!


Nah, just Flux in more blue light during the day. /s


Isn't near-sightedness an adaptation? Nearsighted people can see closer things better. For example, I don't have clinical eye problems but I have some friends who are severely near-sighted, and they can read the security text on US money whereas I cannot.


Is there higher incidence of myopia in places with less sunlight?


Reminder that being outdoors and sunlight have nothing to do with myopia. It's about how much light is focused on your fovea and in your peripheral vision. Being outside is just one common way to keep your peripheral vision from being constantly in focus, because there's not a screen filling up most of your peripheral vision.


Your statement does not make clear whether you're making a positive assertion about the mechanism of myopia, or whether you're making an assertion about some other potential eye condition.

If you are making a positive assertion about the mechanism of myopia, then you are directly contradicting the article, which states the mechanism is not clear. While it may be the case that the article is misrepresenting the science, I personally find this to be unlikely.

In that case, given you are asserting an alternative hypothesis to the null hypothesis presented in the article, I'd ask you for some more backing for your claims.


I urge you to take this seriously. It's frustrating that more people don't understand this topic, but I can't blame you for your ignorance, because you probably only learn from articles like these. Google for myopia control and start learning.


I'm looking at it entirely from an academic point of view. I personally have hyperopia, so a bit of added myopia might actually help. As distinct from presbyopia, which is different again.


So a few minutes ago I wrote code for a company, producing economic value. Last night I kept up with family, maintaining healthy personal relationships. Right now I'm engaging in intellectual discussion of major issues with other people. And last night I watched an episode of Mr. Robot, engaging my brain with a stimulating story. And yes, sometimes I just look at animal gifs on imgur.

I see no problem continuing any of this for 34 collective years.


Kind of fitting that OK Computer was released 23 years ago today. This comment reads like an alternate version of "Fitter Happier".


How much of that time was caring for your eye health so you don't go blind early?


I'm 32 and needed glasses for the first time last year, just mild nearsightedness, which runs in my family, and I'm actually doing better than most (my mom and sister both needed glasses as teenagers).

I eat healthy/exercise and use dark-mode themes where available to minimize glare. That's it for eye health. I wonder if the negative association of eye health with screen time is less about the screens and more about high blood pressure brought on by being sedentary. High blood pressure absolutely destroys the eyes over time.


Is there anything to indicate that excessive screen usage can cause blindness? Especially since LCDs aren't that bad when it comes to eye strain.


I get pretty nasty headaches and my eyes start hurting after looking at monitors with PWM backlights. Some of my screens are fine, some begin to hurt my eyes after an hour or so.


By that logic, 10% of WoW players from 2009 would be half-blind, but they're not.


You don't go blind from staring at a screen. Claims otherwise are the same as 5G conspiracy theories. Photons are photons.

You might get issues from focusing at the same distance all the time. Or that theory about dopamine may be true - but if anything, screens are an improvement over things like paper books. And it applies to adolescence anyway.

And nearsightedness is not anywhere near going blind.


Bruh, long axial length stretches the retina and detaches the vitreous jelly faster from the back of the eye, since eye volume keeps increasing (you see this as flashes of light), putting you at risk for retinal detachment, which very much leads to blindness. :v

I live in constant fear of it (though chance is smol).

And I only played WoW for like 2 years .w.


This seems quite sensationalist.


This article is about the effects of eye strain, not the overall mental-health effects of looking at content on a screen.

With that in mind, what is it about a screen (emitted light) that is worse than outdoor reflected light? One item I can think of: when looking at something by reflected natural light, the iris contracts based on the total amount of light hitting it, whereas a screen may be brighter than the ambient light in the room, causing more of a point source of light to hit the retinas than would normally happen (which is why I've always found it more comfortable to watch TV with some other light on in the room vs. the room in total darkness).


Screens are optimized for energy efficiency, so their emissions are restricted to a narrow range of visible frequencies. They don’t even really cover the whole range of colors, but mix them by combining RGB, which isn’t the real thing. E.g. you can make something that looks like violet light by mixing blue and red, but it’s not the same thing as "real" violet.

Sunlight, however, is a wild mix of broadband EM emissions across basically the whole spectrum, as you’d expect from a glowing ball of plasma (halogen bulbs are actually similar in that regard; they need and do have a UV filter). About a third of the sun’s emitted energy hits the earth’s surface as near infrared light, and then there is UV, etc.

Near infrared light is very beneficial: the mitochondria in our cells can increase their energy output as a direct consequence of receiving photons in that frequency range. There are lots of studies that showed improvement of many health conditions following near infrared or red light therapy. I wouldn’t be surprised if NIR light helped prevent myopia too.


As far as your visual system is concerned, "real" violet (e.g., from a tunable laser) and the "fake" RGB violet are pretty much the same: it's coded as the relative amounts of red/green or blue/yellow almost immediately after the cones.
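
If you want to see that tristimulus point in miniature, here's a toy sketch. The Gaussian curves are made-up stand-ins for the real L/M/S sensitivity data, the primaries are idealized narrowband LEDs, and a negative weight would just mean the target is out of gamut for those primaries; the point is that once the weights are solved, the cones can't tell the two spectra apart:

    # Toy metamerism demo: a narrowband "real" violet and an RGB mix
    # that produce identical cone responses. The Gaussians below are
    # crude stand-ins, NOT the real L/M/S sensitivity curves.
    import math

    WAVELENGTHS = [380 + 5 * i for i in range(65)]   # 380..700 nm
    CONES = [(570, 50), (540, 45), (440, 30)]        # rough L/M/S peaks

    def gauss(x, mu, sigma):
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

    def lms(spectrum):
        """Integrate a spectrum (wavelength -> power) against each cone."""
        return [sum(spectrum(w) * gauss(w, mu, s) for w in WAVELENGTHS)
                for mu, s in CONES]

    def solve3(a, y):
        """Solve the 3x3 system a x = y by Cramer's rule."""
        det = lambda m: (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                       - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                       + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
        d = det(a)
        out = []
        for i in range(3):
            m = [row[:] for row in a]
            for r in range(3):
                m[r][i] = y[r]
            out.append(det(m) / d)
        return out

    target = lms(lambda w: gauss(w, 420, 8))          # narrowband violet
    primaries = [lambda w, mu=mu: gauss(w, mu, 12) for mu in (620, 540, 450)]
    a = [[lms(p)[c] for p in primaries] for c in range(3)]
    weights = solve3(a, target)            # negative weight => out of gamut

    mix = lambda w: sum(k * p(w) for k, p in zip(weights, primaries))
    print(lms(mix))   # same three numbers (up to float error) as...
    print(target)     # ...the "real" violet: the cones can't tell them apart

By linearity, the weighted mix's cone response exactly equals the solved target, which is the whole trick displays rely on.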


Not necessarily. There is a lot of analog in the eyes beyond the nerves, and we don't know all the effects of ambient light on the entire eye structure. Even if the neural impulses end up being similar (and I doubt that) the eye as a whole organ may not respond the same way. Maybe that heat energy triggers something in the cornea? We don't know.


I don't disagree that UV and IR exposure may be important. Variations in UV exposure are thought to account for changing rates of myopia, for example. There may be non-image-forming receptors with different spectral sensitivities for circadian rhythms.

However, the structure and function of circuits involved in color representations has been studied to death, and it overwhelmingly points to a tristimulus model where the activation of the S/M/L cones matters, rather than the complete power spectrum of the illuminant. The sensitivity of rods and cones has been measured with exquisite precision, both behaviorally and by directly recording their electrical activity. Many downstream neurons in visual cortex get their input from individual L/M cones (parvocellular pathway). The others (magno, konio) have a fairly simple mix of inputs from a simple, spatially organized combination of the cones. In some cases (especially within the retina), the individual fibers have been traced and mapped.

There is a hell of a lot we don't know about the brain, but the very early representation of color isn't one of them. If you've got sources saying otherwise, I'd love to see them.


(This may come off a bit harsher than I meant. It's just that I'm surprised to hear someone doubt what I thought was a well-established principle with lots of data behind it. If you do have anything suggesting otherwise, I'd legitimately love to read it.

If not, WebVision is an excellent resource with a long chapter about color. https://webvision.med.utah.edu/book/part-vii-color-vision/co...

The Visual Neurosciences is a behemoth too, if you can find a copy; I don't think it's online.)


We also can’t prove the absence of a magical wardrobe with an entrance to Narnia in our universe, yet many people would say you are insane if you think it’s true.


would it help to set your screens to a grayscale setting?


It appears that the lack of violet light contributes to myopia[1]. There is abundant violet light outdoors.

[1] https://www.nature.com/articles/s41598-017-09388-7


My optometrist mentioned that one big problem with being in front of a screen is that the focal depth is close and constant, and that can cause headaches and other issues. She recommended I stare out a window or make a conscious effort to look into the distance for 10 minutes every hour.


This is what I've always heard as well. Your eye muscles end up "fixed" in a small range of positions, whereas with looking at stuff off-screen (and especially outdoors) your eye muscles do a wider variety of movements.


I don't know whether this is accurate, but I've found lighter color schemes for coding muuuch more pleasing on my eyes than extremely dark/contrast-y ones.


Me too, I can't stand "dark themes". I also prefer a well lit room while some people prefer working in a dark room in front of a bright display.



Part of how myopia works: when your peripheral vision is in focus too much, because something like a screen is close to your face, your eye (which grows on its own, without brain involvement) is told to grow longer to help reduce the over-sharpening of the world it thinks it has.

This happens in developing humans; at some point in adulthood it's suspected this stops (although some people see worsening myopia past the typical age range where it stops).


would it help to keep your phone or monitor further away?


Having less focus in your peripheral vision might help slow myopia progression, so possibly. However, wearing glasses or contacts automatically adds more focus to peripheral vision, signaling the eye to grow. So moving something further away from your face while wearing glasses or contacts may have less of an effect. Having periods of blurry vision, like looking at something far away for some time without glasses or contacts, may be beneficial in that it may signal the eye to become less football-shaped, to help correct for the blurriness it is now experiencing. However, I haven't seen any evidence this can reverse or improve myopia; rather, it just seems to slow its progression.


> Having periods of blurry vision, like looking at something far away for some time without glasses or contacts, may be beneficial in that it may signal the eye to become less football-shaped, to help correct for the blurriness it is now experiencing.

I literally just had that yesterday, and was wondering what it was. My eyes have always been very good, but I have been a bit worried of late after a few such blurry episodes...


No, I mean intentionally unfocusing your eyes by looking at something far away without wearing correction. Involuntary blurriness or double vision is unrelated.


Haven't we also gained 34 years in longevity in the last 100 years? So, net net we are good? :)

More seriously, what types of activities did screens replace? Was it talking to other people? Physical activities/labor? Reading? Nothing?


I know your first comment is mainly in jest, but... https://www.seniorliving.org/history/1900-2000-changes-life-...

For white folk in the US, even the life expectancy at birth (arguably an inflated metric) hasn't increased by 34 years since 1900. For all people in the US (basically everywhere else, too, I think), that ~30-40 years is dominated by infant/youth mortality getting driven hard towards 0.


13 hours a day on average?

That seems quite high to me.

Assuming you sleep 8 hours, that only leaves 16 other hours for potential screen time, of which we are apparently spending 13 looking at a screen. Let's assume another 30 minutes a day for getting washed & dressed, bio-breaks, brushing teeth, preparing food + drink, etc. (30 mins seems low, but I'm being generous), so 15.5 hours left.

So of all of our waking hours, we spend 13/15.5 = 84% of every minute we are awake looking at a screen? 50 seconds of every 60 seconds staring at a screen?

Seems high to me.
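
For what it's worth, the headline number and the 13-hour figure are at least arithmetically consistent. A quick sketch, assuming the study counts adulthood as roughly ages 18 to 80 (my guess, not something the article states):

    # Back-of-envelope: what daily figure turns into "34 years"?
    # Assumption (mine, not necessarily the study's): adulthood is 18-80.
    adult_years = 80 - 18                     # 62 years
    hours_per_day = 34 / adult_years * 24
    print(f"{hours_per_day:.1f} h/day")       # ~13.2 hours/day

    # As a share of waking time (8 h sleep + 0.5 h chores, as above):
    waking_hours = 24 - 8 - 0.5
    print(f"{hours_per_day / waking_hours:.0%} of waking hours")  # ~85%

So the math is self-consistent; whether 13 hours is a plausible average is the real question.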


I bet I'm around 12-13 hours more days than not. Maybe higher. And my phone addiction/dependence is moderate compared to some so it probably comes in a distant 3rd after laptop (easily #1) and TV (all movies, shows, and video games combined)

> Lets assume another 30 minutes a day for getting washed & dressed, bio-breaks, brushing teeth, preparing food + drink etc (30 mins seems low, but I'm being generous), so 15.5 hours left.

Laptop on the dresser/counter playing Youtube. Phones are waterproof—read news or catch up on morning messages in the shower. Laptops that aren't giant bricks and have battery life good enough that you aren't constantly hunting for an outlet and towing your power supply around, plus (even more so) smartphones, have changed everything.

Also depends on how they're counting screen time. Kinda like TV-watching stats. My parents leave their living room TV on probably 15+ hours a day. Does that mean they're "watching TV" if they're making lunch (good view of the TV from the kitchen) but half paying attention to the TV? Or I've come over and we're talking but the TV's on and you (or at least I) can't entirely ignore it?


I can easily see myself hitting 12+ hours a day. 8 hours every day at work, then an hour of TV, then a few hours on my PC at home doing whatever. Plus, using my phone probably adds up to a decent little chunk.

Weekends might drag the average down a bit though, spend much less time in front of a screen on Saturday/Sunday.


I agree. I track my screen time on my computer and mobile phone. I feel like I have too much screen time, but there's no way I could consistently hit 13 hours of screen time per day, unless maybe I put TVs around my house to use as background viewing while I do other work.

Even if I spend 10 hours at the office, it's a challenge to get over 7 hours of screen time logged unless I actively avoid all discussions and meetings and eat lunch at my desk.


Eye strain is a serious problem and actually becoming a bottleneck for my productivity. Flux and dark-mode everything are mandatory, and I recently bought blue-light filter glasses and can't stare at a computer screen without them anymore. I had been reading ebooks on my phone out of convenience, but I stopped due to the eye strain and will revert back to using the Kindle.

If you spend the majority of your life staring at a screen, don't take your eyesight for granted. As you age, it will deteriorate, and it's really not fun, especially when your career/livelihood depends on you being in front of a screen.


I think we are all in the same boat. For me, switching from reading to listening wherever the option makes sense has helped a lot.

Basic news coverage is fine in podcast form, audiobooks work fine for non-fiction (going at 2x or 3x speed doesn't kill the atmosphere), and it makes a good excuse to exercise while listening.


The medieval peasant spent how much time staring at dirt?


Few of them stared at brightly glowing dirt, though. And few of them continually stared at dirt a foot or two in front of their faces.

There's debate how much damage that difference causes, but pretending there's no difference is an unhelpful approach.


Tone down your brightness then? I use heavily customized dark themes absolutely everywhere, and my screen is hardly brighter than dirt. I have no problem looking at it for 30+ hours straight (though I don't do this anymore, for other reasons).


Hominids spent 7 million years evolving on that "dirt" they stared at.


While this article touches on eyesight, it seems like it's quite a myopic view of the issue (see what I did there? ;)

But seriously, we spend over half of our "healthy years" staring at a screen? This should be raising deep philosophical questions:

- How do you want to spend your life?

- When you look back at your life, how do you want to have used your time?

- What is important to you?

One question I struggled with when I went on a sabbatical was "what makes me feel alive, really alive?"

Ironically, my answer for many months was, "I don't know, but it's not staring at a screen."

Many other people answered with things like "rock climbing", "sky diving", and "going dancing", and I think those are pretty decent answers, but mine settled on "anything that makes me grow".

I also thoroughly enjoy spending time with other people in real human-to-human interaction. This feels like a huge slap-in-the-face wakeup call.


A big component is surely from work. In which case, how does the philosophy extend to take into account the fact that people are in offices working?


Yeah, I think this is an important question. Some of it will be inevitable, but I don't think we focus much on "decreasing screen time"... when so many companies (Facebook, Google, et al.) make money from people spending more time on screens (advertising). There's no incentive for corporations to have us spend less time looking at screens.

The few companies that do promote that, such as REI, are outdoor companies. Which could lead you to the really pessimistic thought that even the co-op companies are really just profit-motivated (I don't believe this, but the correlation is there).


We'll all be healthier after Capitalism collapses.


These stats always freak me out when you extrapolate them over the long term. The average adult will also spend 2 years commuting, or an entire year of their life sitting on the toilet.

Lots of wasted time here, people. I think this also shows the power of what we can accomplish by spending 10 minutes a day doing something.
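
To put a number on that last bit (assuming an adult window of ages 18 to 80, which is my assumption, not the article's):

    # What "10 minutes a day" adds up to over an adult lifetime (18-80).
    minutes = 10 * 365 * (80 - 18)       # 226,300 minutes
    hours = minutes / 60                 # ~3,772 hours
    print(hours / 24)                    # ~157 round-the-clock days
    print(hours / 8)                     # ~471 eight-hour workdays

That's roughly two full working years of practice at something, 10 minutes at a time.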


It can be quite difficult to convert random time savings into valuable time.

1) Is marginal free time actually converted to value (e.g., quality study time vs. TikTok infinite scrolling)?

2) Can marginal free time be chained into valuable blocks? (If saving 1 minute commuting means you're 2 more minutes early for 30 appointments, you cannot chain that into a 1-hour study block.)

Edit: I will say Ankidroid has helped me get some value out of random deadspace though.


If they sit on the toilet while staring at the screen, which bucket do those years get assigned to? I.e., these extrapolations could sum to multiples of your lifespan.


Why would spending time on the toilet be wasted time?

Actually, sometimes doing practically nothing is very valuable for one's state of mind.


I’d avoid lingering on the toilet, simply to avoid increasing the risk of hemorrhoids.


The important thing is, people don't just stare at the screen. Many of them achieve something else too.

For example, last week I've probably spent 50 hours with people from different parts of the world on screen.

Helping them turn waste plastic into 3D printing filament:

https://medium.com/endless-filament/make-your-filament-at-ho...


> last week I've probably spent 50 hours with people from different parts of the world on screen

You've spent time with a simulation of those people: their voices lossily compressed with a biased frequency response, their images projected through a non-eye lens onto a 2D surface, their heads bigger and closer than they would be in real life while the rest of their bodies stay hidden, with hundreds of milliseconds of delay between interactions. Don't get me wrong, it is a very convincing and useful simulation for many purposes, but it is a simulation nonetheless.

We might have overlooked this previously, but we will slowly be gaining an understanding of the effects of social interaction coming solely through videochat. I know that for some, no matter how many Zoom calls they do a day, it doesn't come close to creating the same relational satiety.


Speaking personally, most of my good friendships growing up (and even now) were with people online and they feel perfectly fulfilling and real to me. I've been really surprised to hear people (who seem to have been forced into interacting with folks online because of covid) start popping up claiming that online interactions aren't real and are somehow invalid or inferior, as your post seems to imply. I would have thought the presumably largely computer geek HN crowd would be full of folks with similar experience.


> they feel perfectly fulfilling and real to me

That is the whole point of a simulation. Hyper-palatable food, cocaine, porn etc. they also feel good and even hyper-real in our nervous systems in the short term, but that’s not a good way to judge if they pose long term complications. Though, I’m not saying video chat is necessarily a hyperstimulus, we don’t know yet.

> claiming that online interactions aren't real and are somehow wrong

I’m not saying it is wrong, and in the absence of real thing it is the rational thing to do. But thinking that it is the real thing is kidding ourselves and to the extent it replaces real life interactions, it could have long term harm. I know this is not an exact comparison, but we have already seen this with uni-directional audio-visual entertainment replacing real relationship time. We feel like our favorite youtubers, podcasters, netflix protagonists etc are our friends, or at least relationally worth investing time in (otherwise we wouldn’t consume them). Same might go with the 50 people around the world we videochat.


I've been dating someone online for the last two months and I agree there is something missing, perhaps something chemical, that can't be transmitted over a screen and a speaker. But your viewpoint seems a bit too myopic. Humans are exceptional at adapting, and our brains are brilliant at filling in the blanks. I have no doubt that people can have deep and meaningful interpersonal communication without being face to face. I know I have many times, and I have certainly experienced intense emotion with people remotely, and built trust, and felt my social needs sated (if not my physical needs, specifically sex).

Maybe reconsider the idea that it is a problematic simulation, because it's really no more a simulation than how our brains translate vibrations and light waves into sounds and pictures in the first place. If video chat is an illusion, then so is face-to-face communication, as neither one is an unfiltered experience. The filter of standing two feet from someone is only marginally different than the filter of a video chat. The real filter, the one doing the heavy lifting in both cases, is our brains translating the raw data of the physical world into our lived experience.


> Humans are exceptional at adapting, and our brains are brilliant at filling in the blanks

This is exactly the problem. Our adaptive machinery can adapt to the wrong stimulus and get stuck in that. This is called "reciprocal narrowing" and is the mechanism that sustains addiction. I'm not saying videochat is necessarily the wrong stimulus or has an addictive potential, but being able to adapt doesn't mean it is the right thing for us in the long term.

> felt my social needs sated (if not my physical needs, specifically sex). ... because it's really no more a simulation that how our brains translate vibrations and light waves into sounds and pictures in the first place

This sounds like a reductionistic, Cartesian model of what is going on. It assumes something like "stimulus gets in through my senses, is interpreted by my consciousness in my mind, and I get what I need", or "my relational needs and my physical needs are mutually separable". Cognitive science experiments show us the existence of phenomena like "blindsight", in which there is stimulus processing without the possibility of conscious awareness, and "implicit learning", in which there is learning without conscious awareness. In other words, our conscious awareness of what is going on is not necessarily a good indicator of what is actually going on, nor of whether our long-term needs are actually being met. We don't know if we can delineate relational and physical needs (here by physical I mean physical presence, not necessarily tactile stimulus) or to what extent we can reduce relationality to vibrations and light waves going through our auditory and visual systems.

I want to make it clear, I am not saying videochat is bad, I've spent my whole week videochatting and feel like I've met certain relational needs. But it is nonetheless a simulation, and I actually can feel tired and lonely even after a day full of videochat. I don't intend to equate both, and I want to be careful not to displace the real thing with its simulation when the opportunity arises to be present in person. Just like I need to be careful about diet coke and hyperpalatable fast food not confusing the hell out of me to the point of replacing real, long-term sustainable nutrition.


If you've been reading Baudrillard, you might like the blog The Last Psychiatrist, which is a less-rigorous, more ironic take on a lot of similar ideas about hyperreality.


I read both! Thanks! It’s a shame The Last Psychiatrist doesn’t update anymore.


Agreed.


Does staring at two screens at the same time count double? At work as an iOS developer I look at two computer monitors, an iPad screen and a phone screen. When I watch TV at night I'll look at the TV and my phone. I'm sure I can get my count really high.


It might actually be better, as you keep changing focus.


Just my personal experience: I definitely notice my eyesight getting worse and my eyes getting tired when I look at screens for a while. Worst are phones, then iPad, then laptop. TV causes me the least strain. I think it may have to do with the viewing distance or the background light.

When I don’t look at a screen for a few days while on vacation my eyesight gets much better.


We spend 34 years of our lives staring at computer screens.

The rest of the time is just wasted.


Those sound like the first lines of a movie I would watch.


Drooling too? Or just staring? How about "looking at", "watching", "reading", or just "using"?


I used to spend a lot of time in book/magazine/paper reading. Now I read off screens mostly. 34 years of my life spent reading would be time well spent regardless of the medium.


Some good tips to prevent myopia:

- 30/30/30 rule: every 30 minutes, look at something 30 feet away for 30 seconds (a minimal reminder sketch follows this list)

- Take regular walks outside: it will lower your blood pressure and expose your eyes to violet light

- Ensure the brightness of your environment closely matches the brightness of your screen, to prevent eye strain
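
Here's that reminder sketch; it's just a terminal-bell loop, so swap the print for a desktop notification if you prefer:

    # Minimal 30/30/30 reminder: every 30 minutes, prompt a 30-second
    # look at something ~30 feet away. "\a" rings the terminal bell.
    import time

    WORK_MINUTES = 30
    LOOK_SECONDS = 30

    while True:
        time.sleep(WORK_MINUTES * 60)
        print("\a Look at something 30 feet away for 30 seconds...")
        time.sleep(LOOK_SECONDS)
        print("OK, back to the screen.")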


So, as a child, my mom always insisted on me reading with lots of lights turned on. She claimed that reading in the dark causes shortsightedness, etc.

Now, as an adult, I find that I have my screen's brightness turned up way more than my peers. For example, although it's very sunny right now in LA, I have the brightness turned up to 100% on my 16" MBP. In the evenings, though, it's more comfortable at 3 or 4 steps down from 100%.

I can't help but wonder: did she condition my eyes to be less sensitive to light, and thus slightly ruin my low-light vision?


Why wouldn't you have your brightness at 100% if it was sunny? You need the backlight to be bright to avoid it being washed out by sunlight. It would be weird if you said you used 100% brightness in the dark, but that's perfectly reasonable when it's sunny.


Protip for everyone, since I've done a lot of research on this:

If at all possible, cut the blue from your screen at all times.

I use f.lux [0], and even during the day I set my color temp to 5800K instead of the standard 6500K. You won't notice the change much, but it will make a huge difference in how tired your eyes get.

I also go down to 1850K at night (candlelight basically) to both ease strain on my eyes and not mess up my sleep. It makes a noticeable difference.
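
If you're curious how these tools work under the hood, here's a rough sketch of the same idea for Linux/X11. It's a crude approximation of what f.lux/redshift do properly: the Kelvin-to-RGB curve is Tanner Helland's blackbody approximation, and "eDP-1" is a placeholder output name (run "xrandr -q" to find yours):

    # Rough f.lux-alike: approximate a color temperature by scaling
    # the RGB gamma channels via xrandr (Linux/X11 only).
    import math
    import subprocess

    def kelvin_to_rgb(kelvin):
        """Tanner Helland's blackbody approximation (~1000-40000 K)."""
        t = kelvin / 100.0
        r = 255.0 if t <= 66 else 329.698727446 * (t - 60) ** -0.1332047592
        if t <= 66:
            g = 99.4708025861 * math.log(t) - 161.1195681661
        else:
            g = 288.1221695283 * (t - 60) ** -0.0755148492
        if t >= 66:
            b = 255.0
        elif t <= 19:
            b = 0.0
        else:
            b = 138.5177312231 * math.log(t - 10) - 305.0447927307
        clamp = lambda x: max(0.0, min(255.0, x))
        return clamp(r), clamp(g), clamp(b)

    def set_temperature(kelvin, output="eDP-1"):
        r, g, b = kelvin_to_rgb(kelvin)
        # xrandr wants strictly positive gamma values, so floor at 0.05.
        chan = lambda x: max(x / 255.0, 0.05)
        gamma = f"{chan(r):.3f}:{chan(g):.3f}:{chan(b):.3f}"
        subprocess.run(["xrandr", "--output", output, "--gamma", gamma],
                       check=True)

    set_temperature(5800)   # daytime; try 1850 for the candlelight effect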

[0] https://justgetflux.com


I set mine to 3800K for 24 hours a day. Even during the day, under fluorescent lights and an open window, if I have to toggle it off for any reason it feels like my eyes are being stabbed with tiny knives.

I don't know about the sleep stuff (is that hard science yet?) but I personally definitely staved off eye strain and dry eye issues with f.lux and no other changes.


> I set mine to 3800K for 24 hours a day.

Sometimes when I travel to the other side of the planet but don't update my laptop clock, it goes into "night mode" midday. I can use it but I find it jarring with the natural light.

> I don't know about the sleep stuff (is that hard science yet?)

They have links to studies on the flux page [0] although I'm not sure how much of that is peer reviewed or if the sample sizes are big enough.

I know that it makes a difference for me. My sleep quality went way up when I started using flux, and it goes down when I look at screens without it too late at night (like my not-so-smart-TV).

[0] https://justgetflux.com/research.html


I've had sunset-synchronized f.lux/redshift for 7 or 8 years now. Turning on redshift during work revealed I had Pavlovian-conditioned myself into getting sleepy when the blue light goes away.


Hah! Never thought about that side effect. I did both at the same time so never experienced that.

Good to know.


any thoughts on blue light glasses? or setting your screens to grayscale?


> any thoughts on blue light glasses?

I tried them. They worked well enough, but I don't like to wear glasses, so I stopped. Also, I didn't like how they cut all blue light, including the blue from sunlight. It's important to get blue light during the day to help your body regulate its sleep hormone production.

> or setting your screens to grayscale?

That's orthogonal. The white part of the grayscale still has blue light in it. Grayscale is good for helping with attention issues, but not blue light issues.


why attention?


The OS likes to use color to grab your attention. Red notification dots, bright orange tabs (like on HN), etc.

Going grayscale limits how much the OS (or any other app) can grab your attention with color.

I tried it, it annoyed the heck out of me, so I went back. But it did work: it made me less interested in using my devices.


This was already true for people I know who have been watching TV all day since the '70s.


Average adult will spend 34 years with a miracle portal into infinite knowledge and possibility.


Hardly. Scratch most topics past the surface level and if you're lucky the Web can find you the book you need. Usually though you'll need to find the best book the Web can find you, then use that book and/or correspondence with experts to find the ones you actually need.

The problem seems to be a mix of tons of stuff just not being on the Web yet, even decades in (some of it's in ebooks, though, yes, but a whole lot isn't) and Google having given up on "organizing the world's knowledge" or whatever their supposed mission was (now it's plainly "organize the world's advertising dollars into our bank account") so it may be there but good luck finding it.


That's an awfully pessimistic view of things. I've studied some fairly obscure corners of computer science and the Internet has allowed me to get instant access to mountains of scientific papers, a huge variety of books (many of which are out of circulation), incredibly high-quality courses on maths, DSP, algorithms and the like.

Previously I would have had no way of accessing any of that information unless I happened to be at the right university with the right department, or in some high-level research institute.


Certain fields are much better off than others. One might expect that CS would be among the best-represented on the Web.


Due to the changing nature of the web, I'd even say that the sum total of useful information is actually decreasing year over year. I believe we hit an inflection point a decade or more ago, when the number of people with specialized knowledge and skills who were publishing information about their domain online was surpassed by the new "content creators" whose primary goal was to optimize for clicks and ad revenue. This is unevenly distributed across different areas of knowledge: the amount of information about computer- and media-related topics has increased, but other areas seem to have seen a sharp decrease. I don't have any hard numbers to support this, only the feeling of it becoming increasingly difficult to find expert information in many areas. More and more I have to resort to public domain books as the only substantive source. There's just a general sense of something deeply wrong with the current state of the web.


Even if the content creators overwhelm the rest of the crowd now, you can still find experts, and their content continues to pile up. For example, iq (Inigo Quilez) of graphics fame is still adding incredible content to his site 10 years later: https://iquilezles.org/. He's not the only example of this. There may be a higher noise-to-quality ratio, but the quality content is still growing; you just have to get better at tuning out the noise.


Discoverability is still hard when noise increases by the day, and search engines have become useless to find anything meaningful that isn't playing the SEO game.


I'm not sure about that. For the topics I want to dig into, I usually find it's harder to get the level of depth I want in book form than it is to find videos on YouTube.

Some random examples:

- I'm interested in turbine engines: AgentJayZ has a whole host of videos taking real turbine engines apart, talking about obscure features like compressor variable vanes, etc.

- Plumbing: almost everything I've ever needed to know about plumbing has been available from PlumberParts / dereton33.

- Woodworking: I basically learned everything I know from watching people like Matthias Wandel.

- Recently I've been digging into electronics. I have an excellent book, "Practical Electronics for Inventors", but every now and again I need to hear a different take on the same things, and invariably there is a good video or article which clears it up for me.

What's more, to get the level of detail these guys show for free in book form would mean investing in some seriously expensive books, most of which hide the interesting parts under a lot of uninteresting maths.


Oh yeah, for the subset of content that is "watch an expert do a thing", the Web has become extremely good. Nothing before touches it. In part I think it's remained so because almost all the content is on one site: there's no web search involved in finding it at all, you just search YouTube (which is owned by the main company that might help you find videos on other sites). If Google had found a way to reward putting academic & deep cultural-knowledge material on a platform they owned such that they had a near-monopoly, perhaps that site would be good, while the web at-large might remain basically useless for it (or, probably, even worse, as everyone went to that single platform). So it depends on one's perspective whether it's the Web delivering value or just Google's platform, which could be served just about any damn way at this point (if they decided to transition to a new non-Web protocol just for YouTube and put it in Chrome, they might well succeed in forcing everyone else to adopt it).


What a great point. If I want to know how to do something physical, or learn how a piece of hardware works, YouTube is the place.

Since it is Google, how awesome would it be if videos also had related web searches attached to them, instead of just related videos. How-tos could have links to the original manual, the most detailed maintenance manuals, or patents and blogs detailing the physics behind how that thing works. Probably not going to happen and would only be for power users, but it could be so powerful.


Thirty years ago you couldn't scratch past the surface of more than a few dozen topics, based on your local library's availability, if you were lucky enough to have a decent one to begin with.

Between libgen, scihub, Google Books, Project Gutenberg, torrent sites, and the rest of the internet we've got access to almost all of the world's knowledge for free with plenty of avenues to interact with experts using a variety of different channels for knowledge that hasn't been put to paper yet.


Proportionally, it might not be much better today. Maybe 1/100,000 books is exactly the piece of information you need. Certainly the rest of the books aren't total crap either; they at least passed the bar of getting published and making some sense. I wouldn't be surprised if the signal-to-noise ratio on the internet today was 1/billion web pages, with all the automatic SEO crap and misinformation out there. The sites and resources you listed are popular in tech circles, but are not mainstream at all. Then again, if they were, they probably wouldn't exist as we know them, if at all.


Making piracy easy has been super-helpful. Libgen doesn't have (anywhere near) everything but it's great for surveying a field and picking up some portion of the must-have resources.

> access to almost all of the world's knowledge for free

I'd guess we've got access to ~10% of it (but it's a good 10%!) for free, ~30% paid (but usually piratable!), and the remainder unavailable as a paid electronic resource but maybe (often not) as a paid paper one and maybe (often not) available pirated—maybe you can at least locate or learn of its existence with the Web, but possibly it's totally unknown to the Web outside of maybe a reference in some books that happen to be digitized (this does actually happen, though you do usually have to get a little obscure before it does, but sometimes not as obscure as one might think).

[EDIT] to be clear I think the Web is an excellent research tool, however an awful lot of its value in that role is from piracy (saving me, say, having to inter-library-loan or buy a book just to find the books that book name-drops in its preface, or to read one relevant chapter, or to see the book's index so I can find out whether I need it in the first place). I think its containing anything like "all the world's knowledge", even for liberal values of "all", is far from true, and there's a risk of thinking if something exists it's probably available as a digital resource delivered over the Web (far from true), and if it doesn't then surely it's at least possible to find out about its existence on the Web (also not true), and if neither of those are true it must be something of no value whatsoever to anyone or wildly obscure (not true).

[EDIT EDIT] then even if it is on the web, it can be really hard to find something that's not on one of a few major sites using search. More so than it used to be. It can take so damn long that pirating and reading a book on the topic on Libgen can be faster than finding the same info on the open web, even if it's there, which is pretty damning of the state of web indexing. As I wrote in another comment on this thread, it was once possible to be fairly sure when one had reached the edge of Google's knowledge and the thing you're looking for was not in its index—such certainty is now almost impossible, mostly because of how Google search (and seemingly all other web search tools) has changed, not because of increased total content.


Google, for all its current problems, _has_ made organization of the world's knowledge vastly better than it was before they showed up.

I could be wrong but just surrounding the field of software development, there's probably already more relevant information on the internet than I could possibly consume in my life.


I've definitely felt and experienced the situation you're describing, but I think you might be painting with too broad a brush.

Sources like ArXiv, Google Scholar, and Semantic Scholar have made it really easy for me to access tons of deep, academic/professional-level knowledge in my field instantaneously and for free. I know that Physics, Math, and Computer Science are relatively uniquely privileged when it comes to ArXiv, but even paywalled journals are more accessible in the internet age than they were before. I'm even able to find publicly released technical info from the DoD through DTIC that I don't think I would ever have known to request without it coming up through search engines.

Likewise, tons of undergrad classes through MIT and the Ivies are available for free on Youtube; more still are available with exercises and some form of certification through services like Coursera, EdX, and Udacity. Some graduate and professional-level courses are starting to come online through these institutions as well; the Institute for Advanced Study has started releasing its lectures on Youtube over the past few years.

I can find manuals for my car, my appliances, and all my electronics online. I have access to all sorts of do-it-yourself how-tos on Youtube when I want to do work around the house. I have StackOverflow for when I have challenges with programming languages and tools that aren't clear from the docs, and the rest of the StackOverflow network when I have questions that are more specific to my problem domain.

There's more throughout all sorts of fields: instant translations with Google Translate and language lessons through DuoLingo, Rosetta Stone, et al; music lessons on Youtube, etc.

Again, I've been through the experience where some older materials just aren't as available as I'd like, and where the normal search tools only bring you part of the way. But I think you're taking a lot for granted here.


"A screen" is not "the web".

That book will probably be in a screen too, since it's way too hard to carry it around on paper.

Even "correspondence with experts" will be on a screen.


Approximately once per week I observe a comment on HN saying approximately "search doesn't work anymore." Why is this? Why do I only observe such complaints here?


Lots of people here remember being "good at search": able to make exactly the thing they were looking for shoot to the top 3 spots on the results screen, and able to craft a series of searches such that, if they all failed, one could be pretty damn sure the information wasn't online or at least was hidden from the Google bot. Neither of those is really possible anymore. The 100th time one attempts to find something on Google that one knows is there but just cannot get to show up, no matter how many rare keywords one uses, one concludes Google has lost some pretty significant utility it used to have, and starts to wonder what one is not finding when one doesn't know exactly the page one is trying to find.

(Of course it may be better for lots of other things, but for a fairly large set of "finding things on the Web" tasks it's way worse than it used to be, to the point of being nearly useless. In part I think this is because sometime around 2008-2010 they stopped trying to fight webspammers, choosing instead to embrace some set of them provided they play by Google's rules, and to hard-downrank anything that wasn't "well-behaved" webspam or a well-known site.)


I definitely remember a time when Google was much, much better at returning relevant results for technical content. My feeling is that there are (at least) two factors at play:

1) Google and other search engines revised their algorithms more than a decade ago. Instead of showing you results for what you searched for, they now show you what they _think you meant_ based on your search history and the search histories of millions of other users. This means searching for uncommon topics and phrases gets you useless results. And you can't disable this functionality because it's baked into the core of how Google indexes and categorizes the content that it slurps up from the web.

2) Blogspam authors and e-commerce sites have gotten SEO down to a science, to the point where searching for nearly _anything_ not super specific no longer gets you _information_ about that particular thing; it gets you blogspam articles filled with fluff and affiliate links, or half-broken e-commerce sites trying to sell you something vaguely related to what you searched for. This is not technically Google's fault, but there is a lot they could do to curb it; all that ad revenue on those sites is how they earn that sweet, sweet lucre.


> This is not technically Google's fault, but there is a lot they could do to curb it; all that ad revenue on those sites is how they earn that sweet, sweet lucre.

That had been going on for years, but there'd be clear times when Google got ahead of it and results would get much better for a while; because search was so much more precise, it was possible to work around the spam. That those good times stopped happening, and that results have settled at a consistently high level of spam from content that's looked like pretty much the same sort of crap for years, leads me to conclude they stopped trying. IIRC, right around then they dropped the "no no, our ads are different and good, they're just text and always formatted the same way so it's easy to tell what they are" stance and became just another banner ad slinger.

[EDIT] just mined Slashdot for that last bit, looks like that happened around the last half of '07, which roughly checks out with my recollection of Google search abruptly getting much worse around '08-'09 then never getting better again.


It's not for us to judge what sort of information constitutes knowledge.


This might be field dependent. All my research is based on scientific journal articles and reviews, which are electronically available, and books are more vanity affairs


Surely you started with textbooks.


College textbooks in cognitive sciences are the epitome of "scratching the surface"


And will use that infinite potential to binge watch Friends for the 11th time.


I'll have you know this is a Star Trek: Deep Space Nine household, and we are on our 12th rewatch of the Dominion War, tyvm.


I'm going through TNG right now, and the stories still hold up very well. There are a lot more useful lessons in them than I remembered from watching as a kid. I really do like the show's aspirational view of humankind, too.


I went through TNG again a while ago and I was struck by how much I'd forgotten, and how I'd kind of lost my way in terms of my beliefs. I grew up wanting to be like Picard, and watching them again reminded me of that.


I grew up on TNG and I'm now re-watching it all with my daughter. The first season is just as rough as I remember it, but I love how much of the core themes and character of TNG they got absolutely right from the very first episode.

One thing about the show that struck me is that I'm amazed at how inter-personal communication has changed over the decades. I thought I remembered everyone being a lot more easy-going when I was a kid and then decided that I must have imagined it. But no, sure enough, right there on Star Trek you see the characters poking light fun at each other not long after they first met, even in a military setting.

Today people seem to take themselves much more seriously. You have to be a lot more guarded and diplomatic, sometimes even with people you know very well, otherwise you risk alienation or embarrassment. I know TV is not reality, yadda yadda, but there's a kernel of truth in there somewhere.


It might be part of the TV era, but I imagine having people who aren't constantly on the defensive would fit into Gene's vision of a better tomorrow. If you can assume that everyone in the room is on the same team and wants the best for everyone, you probably would feel less offended by gentle jabs.


Yeah. Picard was a great leader. Humble, stern when needed, compassionate, trusting of his subordinates, etc... An amazing character.


Deep Space Nine is probably my favorite TV series of all time. I've only watched it twice though. Probably gotta step it up if I am going to hang with this crowd.


If Netflix ever adds Spin City to their catalogue I will be in trouble.


Let me tell you how much productivity I got back when they took Frasier off last year.


watching newsradio here, with rpcs3 classics on the second monitor


Newsradio for this guy


I have a folder of the entire series of Newsradio that's been played in the background (while I do other things, I don't even watch it, just listen)... sooo many times. Corner Gas is the only other show I've played as often.


Only your 12th... I may have a problem.


I have a large collection of movies and ripped them all to Kodi. I was going through marking a bunch as watched, and I started to get slightly upset at myself as I marked each one and thought "watched that one 8-10 times, watched that one 5-6 times, watched that one 3-4 times," etc., for many, many movies. And there are TV series I've watched multiple times as well. So much time. Sure, I enjoyed it, but counting it all up, it's still shocking to think what else that time might have been used for.


If you assume that rest time had no value, possibly, yeah, but we have fun for a reason. Productivity can't be the only thing in our lives.


Realistically the majority of that time is spent working for some company, writing mundane CRUD applications and gluing APIs together


Average Adult Will Spend 60% of Their Existence With Their Eyes Open.


If this was "Average adult will spend 34 years of their life staring at books" would that also be a bad thing? To me and my work/research/fun it's the same...


Can we quantify the "net screen time" once we subtract the time spent on activities that have since been replaced/optimized by automation?

But leaving that aside, the 34 years figure is understating the true time spent. That figure refers to 34 years of 24-hour days. If we subtract 8 hours a day for sleep, it is actually 51 years of our waking hours.
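
To spell out that arithmetic (a quick sketch in Python; the 8 hours of sleep is this comment's assumption, not a figure from the study):

    # 34 "years" of screen time counts 24-hour days; re-express it
    # in 16-hour waking days to see the share of waking life.
    screen_hour_years = 34 * 24        # total screen time, in hour-years
    waking_hours_per_day = 24 - 8      # subtract 8 hours for sleep
    print(screen_hour_years / waking_hours_per_day)  # 51.0 waking-years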


If you are in IT, or any knowledge worker for that matter, do we even have a choice?


e-ink monitors?


Formerly the average adult spent 34 years staring at text on the printed page.


100 years ago the average adult (who dealt with information as we do today) probably spent 25 years of their life staring at paper.

It’s the content, not the medium, that matters.


"screen time" is too broad a term and is too often described as something bad. But we need to understand screens are more like papers, and what we consume off the paper is what matters.


OP: "Average adult will spend 34 years of their life staring at screens"

Comment: "And for 33.9 of those years, it will be at things that are not work related..."


I'm not sure about you, but 6-8 of those 13 hours of my day are spent working.


I'm happy to be above average at something at last.


Aaaand now I feel the sudden urge to get away from this screen and go walk through some trees. I'll see ya all later.


I'm an ape man i'm an ape ape man no i'm an ape man

https://www.youtube.com/watch?v=aRHqs8SffDo


Good thing I started out super far-sighted. If anything, my vision has improved with time focusing on close screens. :P


Better than staring at books as before. Imagine the book giving you feedback or allowing communication.


Staring is an embarrassing word for this person to have imposed upon the actual study.

Science journalism is a real problem.


If the average adult spends 34 years staring at screens, what would us tech workers average?!


I wonder if they count staring at "n" screens for 1 hour as "n" hours.


has anyone seen endmyopia.org? a friend of mine suggested it because he said he was able to perform “eye exercises” to improve his eyesight. He said it wasn’t perfect, but he no longer needs glasses.


I just created an account to answer you. Yes, endmyopia.org is great. The same way we develop myopia, we can revert it naturally in 99% of cases. I was really skeptical at the beginning, but if you participate in the forum, read a lot, and follow the thousands of people taking part in the process, you will believe it for sure.


I don't think that's true. Eventually, we'll have implants.


I've spent 34 years staring at screens since February.


Oh, I can totally beat that and do way more than 34 years.


Well, personally I don't sit staring at a screen, I watch movies, play games, talk to people, build stuff and learn new things. You wouldn't say reading a book is "staring at paper" would you?


Such a good point. I had a small existential crisis watching programmers on YouTube/Twitch because I was focusing on the sounds of the keyboard typing and realizing that they are sitting in front of a screen typing for hours on end and how pointless it was.

The second part of my crisis was that I was no different.

After a few months of dealing with this I remembered that actually they are engaging with an entire world, and so was I. Rhetoric is so stupid sometimes.


I find seeing video of myself using a computer downright horrifying. I look like that? A good chunk of every day? Yikes.

I mean I don't stop, because money, but I'd love to be able to take weeks- to months-long breaks from screens.


Never heard it put so plainly, that’s awesome.


Funnily enough, I remember talking to a historian who said centuries ago, reading a book was considered "mindless entertainment", much as we think about our smartphones now. Eventually, reading evolved into being considered something generally "smart" people did.


I hung out with some Vietnamese Zen Buddhist monks at a monastery for a few days, and I was surprised to find that they viewed most fictional novels as mindless entertainment as well. Books with obvious educational value and religious texts were fine, of course, but spending a few hours reading a novel would be lumped into the same general category as spending a few hours binging a Netflix show, so they did it sparingly.


Many were quite opposed to books and have been throughout history. This is a fun read https://www.historytoday.com/archive/reading-bad-your-health


I'd also love to read more about this, similar to the sibling comment from carierx1, who appears to be hellbanned.


> carierx1, who appears to be hellbanned.

If you read their 1 page comment history, there's dang explaining "We've banned this account." and why.


I really wish this was discussed more often. As a parent, I am constantly bombarded with people telling me I should limit my kids' "screen time."

Generational skepticism of new technology is played out. Can we move on yet?


Yeah me too. Anecdotally with my two children, they make huge educational jumps every time we increase their screen time.

Screen time for us was playing games with educational value and watching TV with at least some educational value.

Recently we've started letting the five year old watch TV purely for entertainment, and the three year old sometimes watches too, but even then, both of them suddenly got a lot better at storytelling and coming up with their own original stories.

So even "pure entertainment" seems to have educational value for the kids.

The rule in our house is that you can't use a screen if the sun is up unless it's a weekend. That seems to be a good balance (although under the current conditions that's flexible if the screen is for an online class).


From how you started out, I was expecting you to be very liberal with 'screen time', especially since you seemed to agree with the GP. But:

> The rule in our house is that you can't use a screen if the sun is up unless it's a weekend

I think that's actually limiting screentime more than most US households, you aren't actually on the "no need to limit screentime" side at all!


It's important to note that we are night owls. The kids have TV basically from 7pm (when I turn on Jeopardy regardless of sunset) until they go to bed around 11pm. And during the current situation, sometimes they go to bed around 1-3am.

So they actually get a lot more screen time than most of their peers.


Yeah, we've shifted to later as well, I'm a night owl by inclination, the boy seems to be as well, he'll crash from 3am til 12pm then get up.

Given he's happy I'm not rocking that boat.


I think they still get less "screen time" than a lot of kids I know, especially since pandemic time at home skyrocketed.


Five and three year old go to bed around 1-3am? I don't necessarily have an opinion on that, but it surprises me :)


They also wake up around 11am-1pm, so they still get 10 hours of sleep. :)

Turns out not every kid loves mornings!


I know I didn't!


I think the point of limiting screen time is to help young kids learn to concentrate on tasks for longer periods of time. This is recommended by most pediatricians.

Our kids, who are home with us and in the same room where both my wife and I work, are able to play by themselves for the most part every weekday from 9am to 12pm.


My partner and I were just having a similar conversation. About how we "feel bad" when our child uses their tablet a lot in a day but we're not quite sure why.

When using the tablet they are playing educational games about sorting/identifying shapes and colors, painting by numbers, solving puzzles, etc.

Or watching/listening to songs that are teaching them colors, numbers, letters, etc. And dancing around because they like the music.

These are drilling home lessons in a fun and repetitive way that we simply could not do for our child otherwise. And we've seen educational jumps from it too.

Plus, let's be real, I make my living through a screen. So why is "screen time" stigmatized as a bad thing?

I've come to the realization that screens are not an evil thing to be minimized at all costs. They are the most powerful tool of the modern age.

It's just important to get some exercise and fresh air too.


I think it's irrational. "Screen time" is today's Dungeons and Dragons and heavy metal. People are doing things differently today than I did at that age, therefore it's bad and needs to be limited.


As a millennial who spent most of the past decade behind a computer screen (and probably the previous decade before that inside of books), I mourn the time I lost that could have been spent interacting with the world.

Moreover, maybe both sides are right? Maybe you need heavy "screen interaction" to be successful in this world and maybe too much time is being consumed by screens rather than physically interacting with the world. Maybe there is a balance that isn't close to being struck by most.

Maybe I would like my future children (if I have them) to interpret the world largely through their own eyes rather than a clickbait interpretation manufactured to get more money.


I love the fact that my generation (nearly 40) and the one after me both grew up playing ultra violent games and listening to rap etc and yet the crime statistics for violent crime are lower than they've been for about a century.

Schadenfreude on that one.


Eh, I'm in camp "limit their screen time" if the screen time is all junk food. I also don't have kids, so what do I actually know? :) The only context for this opinion is knowledge of the army of adults whose job it is to "maximize engagement" and dole out little dopamine hits.

Some apps just seem predatory.


> Some apps just seem predatory.

Oh boy are they ever. We spend a lot of time curating the kids' screen time. We try every app first.

Youtube Kids is banned. Youtube's AI sucks at vetting content and they refuse to put humans on the job.

The apps from PBS are all gold. Disney+ has so far proven to be good at making sure all the content is kid appropriate.


YouTube Kids also lacks an option to disable Autoplay, which makes it pure evil in my eyes.


Amusingly PBS kids is the only streaming app I've seen that doesn't ask "are you still there". We've left that one running overnight by accident and it just keeps going and going.


I dunno why we can't just hand-select any video from YouTube for YouTube Kids. I don't need YouTube's algos to recommend bullshit to my kids, and I don't need YouTube to control what I show to my kids.


You can create a playlist if you want to hand curate a list. But the whole reason I use streaming services is so I can pay someone else to curate the content for me.


I can't put a playlist into YouTube kids. I tried.


Here's a video on how to do it:

https://www.youtube.com/watch?v=AjKmQwhSk1s

It's not exactly a playlist, but you can hand select content.


Thanks will check it out.

Edit: I watched the video. I'll have another look at the app tomorrow, but I'm pretty sure that video does what I'm complaining about. The catalogue is pitiful. I just want access to all of YouTube. It's not a big ask, but they won't make $$ off of showing adult videos to kids, so they won't offer that.


It's the parents' job to teach kids how to tell junk food from healthy food; predatory apps from apps which add value; "you're the product" from "you're consuming the product". I do this with my kid, and technology is a boon for her. However, most adults aren't qualified to make the distinction between good and evil, so their kids suffer too.


> It's the parents' job to teach kids how to tell junk food from healthy food

I agree with the sentiment, but an important point is not to forget that behind the curtains there is an army of product managers, AI PhDs, and tons of data running a version of The Truman Show on each one of us. Usually "you're the product" is blended with "you're consuming the product that adds value". For example, you can search and land on a video to watch something educational, but opaque recommendation algorithms, un-turn-offable autoplay, nagging notifications, and whatnot will try to convince you, like an optimally-annoying salesman, to stay just a little longer and pay them in attention and ad revenue, or give you those dopamine hits so that you will want to come back to "just check" the app in a Pavlovian fashion.

Whenever you or your kid interact with a screen, you are potentially interacting not only with machinery with an inherent information asymmetry, but also machinery that we teach, every day, exactly how much abuse we are willing to take. For further reading, see Tristan Harris and the design ethics questions he brings to light.


Lots of "teaching" is literally excluding things from consumption for young kids. Parents know better than their kids; why entrust that kind of advanced decision making to them?


> why entrust that kind of advanced decision making to them?

Because then there's a hope that they'll actually learn the underlying principle and make similar decisions in situations where someone else is not directly in control of their behavior.

There are kids who don't get to eat ice cream before dinner at home because that's the rule, who will happily do so when over at someone else's house without their parents around to enforce that rule. Then there are kids who actually understand why they shouldn't eat ice cream before dinner, who will decline to do so even if they have the opportunity. (That doesn't mean they'll exercise perfect judgment every time, but then, there's also no guarantee they'll follow rules that aren't being enforced.)

It's important to develop the critical thinking skills to filter out "junk food" content.

(To clarify: I'm not talking about children too young to understand the concept, I'm talking about children more than old enough to make such decisions in an informed way. Roughly speaking, think 12, not 3.)


>Lots of “teaching” is literally excluding from consumption for young kids

Sure, I agree with you. Limit what they have access to while using the screen, but not the actual screen time.

Somewhat analogous to the difference between limiting junk food from kids' diets vs. making them go full vegan and doing intermittent fasting.


Like most things moderation is key.


Maybe those people telling you to limit screen time have different observations than you. It really depends on the individual child and my wife and I have first-hand experience with both ends of the spectrum.

One of my kids thrives on screen time because she actively seeks out creative and informative non-passive activities. She is in control of her technology usage rather than the other way around. And even though we keep tabs on her, we trust her completely to manage her own time.

The other gets limited screen time because, if we don't put strict limits on it, he has absolutely immense self-control issues. He throws temper tantrums, claims to be bored all the time, picks arguments with others, refuses to follow simple instructions, and won't stay in bed for anything at bedtime. Put simply: he displays all the hallmarks of chemical addiction with even a moderate amount of daily screen time. All of this greatly decreases, and he comes very close to being a model citizen, when we reduce his screen time to only that which is required for his school work.

And somewhere in there is the fact that human adolescent brains were not designed by evolution to sit indoors all day looking at screens or books.


> Maybe those people telling you to limit screen time have different observations than you. It really depends on the individual child and my wife and I have first-hand experience with both ends of the spectrum.

But for those observations, is the problem the screen or what is on the screen? As a society, I'd like us to focus less on the former and more on the latter.

A screen is just a window that can become anything, and I don't see why it should be inherently more or less educative than any other experience.

Let's say in the future we get perfect VR headsets that can perfectly replicate every sense. Would the conversation be different?


I SURE am glad my parents didn't limit my screen time as a kid. I taught myself how to program from age 14, spending countless, countless late-night hours staring at the DOS computer screen programming whatever caught my fancy. My career would look so much worse if my parents had forbidden me from using the computer.


To many of us, careers are far from the most important things in our abysmally short lives.


I find my work enjoyable and am glad it supports my family and the team members who work for us. It started out as literally me working crazy sleepless hours coding freelance using those self-taught skills.


Well, I agree that the issue needs careful examination. Not all screen time is created equal. That said, to an average person, screen time does not really mean educational applications, writing code, or even reading. Instead, to most, it tends to mean Facebook, Netflix, Youtube, and increasingly creepy games designed to suck you in and bleed you (or your parents) dry.

I am saying this as an expecting parent. My parents were raised without TV, and there is a clear difference in how they process the information presented to them. Heaven knows I do not process it the same way, what with being basically raised on the internet and TV.

In short, I do not think we should move on. I think it is worthwhile to ask more nuanced questions than 'is screen time bad? discuss'.


There's a really interesting book called Digital Dementia that describes effects of modern digital tools on our brain. I'm not saying it's 100% correct, but interesting read nevertheless.


I'm with you but I think social media should be limited when they get to that point.

Some parents also think you should be playing with your child at all times, and that's absurd too. It's good for kids to learn how to entertain themselves and be creative as individuals. There's not much more obnoxious than kids who grew up with their parents always playing with them or managing their attention, and who now complain about being bored.


Most kids who complain (loudly) about being bored are, in my experience, those over-saturated with TV, phones, tablets, etc. that are almost always around. Once you remove those (e.g. on a trip to remote nature, or a vacation), they can go slightly mental for a while.

It seems to me that the recipe for that ideal self-sufficient, loving, smart, social, over-achieving kid that all parents seem to want is sort of a pipe dream, and it always has been; just the environment has changed. We all want them, but few kids will end up being one, and it's often not that much the fault of the parent.

Plenty of love, plenty of time together, some time alone, some with others, and well-managed discipline seem like a good start. I would add some traveling and exposure to foreign, exotic cultures. One can't avoid screen time these days; I mean, if I want to show my kid to my parents during Covid, it has to be via webcam. But passive, aimless consumption isn't for me, and for sure it won't be for my kids. I'd rather put them into a rock climbing course.

What happened to those topics about billionaires forbidding their kids access to phones & TV before the age of 7?


I find it's more helpful to see screen time in two buckets:

Creating/creative/thinking time spent on a screen is very different from passive consumption time.


It's much harder to articulate "limit skinner box time" or "limit time spent using services that play on gambling impulses and give instant gratification like endless youtube playlists and mobile games."


>> Generational skepticism of new technology is played out. Can we move on yet?

Is it, though? Spending lots of time in front of a TV is still considered harmful as it was half a century ago.


Both Steve Jobs and Bill Gates limited their kids' screen time.


I don't think either of those people are well known for their excellence in parenting.


So did John Doe. Not sure I understand what the argument here is.


You can make it out to be whatever you want. But there is a physical difference between a screen's light emissions and the light perceived from paper. I've always been curious exactly how different they are, but anyone claiming otherwise is suspect in my eyes.


Your intuition is correct. Even if they have the same apparent color, two light sources can have drastically different spectra, which is perceptible. Some info here: https://en.wikipedia.org/wiki/Color_rendering_index


Fully in agreement with your take on this. One small clarification, however.

>You wouldn't say reading a book is "staring at paper" would you?

In our day and age, you wouldn't. But this was definitely a thing in the 1800s (regarding books) and in the early 1900s (regarding newspapers), with very similar excuses used to rant about those as are being used today regarding screens.

And even earlier, in Ancient Greece, tons of prolific philosophers were strongly opposed to writing and books as a concept, as they believed that writing things down and learning from that (as opposed to oral learning) was eroding abilities that relied on memory (which parallels really well with people currently complaining about search engines removing the hard need for memorizing things).

However, knowing this only makes your original point stronger.


I totally agree with you.

I also think we might want to compare. I suspect 150 years ago most people stared at a shovel, or a clothes-washing bowl, etc., for 34 years of their life.


This is an excellent point (sincerely), but in addition there is quite a bit of difference in what is required of a reader "staring at paper" vs. many of the couch potatoes staring at their various screens.

HN is probably packed with more active users of screens than the general population.


>You wouldn't say reading a book is "staring at paper" would you?

Actually, there has been a colloquialism for that since the early 1900s: "nose in a book."

To your point, no one actually has their nose in the book.


Where were you when my parents kept telling me "get off that computer"? :'(


>You wouldn't say reading a book is "staring at paper" would you?

Well, ultimately, it is...


Of course, but nobody ever says this. Except when reading was a new thing.


Mostly because nobody (statistically) spends too much time reading books. And because books have (on average) more substantial content than social websites and other such uses of time, so that's a better tradeoff.

That said, some people can (and do, more so in the past) over-read, to the detriment of their social life or even to their health...


Do books do the same damage to your eyes as screens?


Is that really supported by empirical trials? A short search online seems to point to that being somewhat disputed. I'm all for contradictory evidence though if you have some I missed.

Yeah, sure, a screen is emitting light, which maybe makes it worse for the eyes to look at in some ways and situations than a paper page, but "looking at light" is almost a tautology - isn't that what eyes do?


Eyes usually look at diffusely reflected light, which has a different impact than direct light.

A white reflective surface (whiteboard) in direct sunlight is 1.6cd/cm^2. Consumer monitors are ~300cd/cm^2

(I'm not sure how much bearing this actually has on eye health; this is just to illustrate that looking at a monitor and looking at a piece of paper are significantly different.)


I think there is a unit error here. I believe that sunlight on white paper is about 1.6cd/cm^2, while consumer monitors are about 300cd/m^2 (meter not cm). Essentially, you want the brightness to match the environment.

http://www.infocomm.org/filestore/display_specs_and_human_vi...
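
To spell out the conversion (a quick sketch in Python using the figures quoted in this thread; none of these numbers are my own measurements):

    # cd/cm^2 -> cd/m^2: there are 100 * 100 = 10,000 cm^2 in one m^2.
    paper_in_sunlight = 1.6 * 10_000   # 1.6 cd/cm^2 = 16,000 cd/m^2
    monitor_max = 300                  # typical consumer monitor, in cd/m^2
    print(paper_in_sunlight / monitor_max)  # ~53x: sunlit paper is far brighter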


There is indeed. Oops. I'd delete it - no point polluting the Internet with more bad info - but the time window has expired.

Thank you for the correction!


Your units are wrong. Monitor brightness is measured in square meters not square centimeters. A square centimeter is 10,000 times smaller than a square meter.


As noted, your units are wrong. A white reflective surface in sunlight is FAR brighter than a computer monitor.


Hm, to be honest, I find reading a book in direct sunlight quite a bit more challenging than staring at a screen, because the reflected light is quite unbearable.


>A white reflective surface (whiteboard) in direct sunlight is 1.6cd/cm^2. Consumer monitors are ~300cd/cm^2

Why does it hurt my eyes to look outside after a while looking at a screen in a well lit room? (the sun is not directly visible, just buildings, trees and sky)


Because the sun is an enormous sphere of hydrogen-helium plasma with a core that's undergoing fusion just due to gravity. Even indirect sunlight is still a very large amount of light and energy.

I don't mean to sound flippant, but I think sometimes people forget just how insanely energetic stars of all sizes are.


But GP's point was that a screen is much brighter than a sunlit surface. You can't both be right, or I'm missing something.


cd/m^2, not cd/cm^2

I don't understand. Monitors are intended to be set to the same brightness as the ambient environment. 300cd/m^2 is the max, not the average.

Of course shining a monitor in your face at maximum brightness is bad, just like staring at a light bulb all day is bad.


You've got your units wrong. Very obviously so, to anyone who has ever taken a laptop outside in daytime.


Hey, do you have specific examples of the damage you're concerned about?

I've been reading up on this a lot, as it is something that concerns me, but so far most of what I've read says that long-term damage is mitigated by taking regular eye breaks (i.e., staring at something 20+ feet away for 20 seconds every 20 minutes) and getting enough sleep (giving the eyes time to properly rest). Are these mitigation strategies insufficient?


Does that matter? Running damages your knees. Playing tennis damages your elbow. Throwing a ball damages your shoulder. I do the thing I love while staring at the screen. If it damages my eyes that's the price I pay. I'm not going to stop doing it.


Can you provide evidence of screens causing damage to eyes? My ophthalmologist shared with me that there is general irritation, but we haven't been able to prove that screen time degrades eyesight over time.


Staring at anything too close to your eyes for a long period of time isn't good for them. If you are taking appropriate breaks, my understanding is that screen time is harmless.


I've got the same understanding - I asked my ophthalmologist about it as well because my vision is already quite bad and I'm a computer engineer.

For breaks, she recommended the "20/20/20 rule" - focus your eyes on something at least 20 feet away, for 20 seconds, every 20 minutes of screen usage.
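
If anyone wants a nudge to actually follow it, here's a minimal sketch of a 20/20/20 reminder in Python (the 20-minute/20-second timings are from the rule; everything else is just illustrative):

    # Minimal 20/20/20 reminder: beeps and prints every 20 minutes.
    import time

    WORK_SECONDS = 20 * 60   # 20 minutes of screen time
    BREAK_SECONDS = 20       # look 20+ feet away for 20 seconds

    while True:
        time.sleep(WORK_SECONDS)
        print("\a20/20/20: look at something 20+ feet away for 20 seconds.")
        time.sleep(BREAK_SECONDS)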


Yes, I've gotten that recommendation as well. Apparently more natural light exposure may also decrease the risk for nearsightedness. Having a window near my computer monitor makes it easy to remember to look outside whenever I'm not actively looking at the screen.


I think there is a nuance in reducing screen time versus increasing natural light exposure.


Probably? It's not like screens shoot death rays. It seems very likely that looking closely at patterns within a rectangle has the same effects whether it's made of paper or electronics.


CRTs used to emit X-rays, so they actually did shoot death rays.


Any ray is a death ray given sufficient dosage.


And that was the day the Care Bears let their stares' full power be known.


They are backlit though, that's a big difference. Not sure if any issues would be caused by this.


Is it a big difference? If you take the reverse of ray tracing, why would a light beam of given intensity and wavelength act differently on my retina because it was directly transmitted vs reflected?


Backlit screens are brighter than a book under typical indoor lighting conditions. (Although this doesn't really support the original point, because the canonical alternative of going outside is much brighter than anything you do indoors.)


Depends which game you're playing.


If you believe my parents then all the years of reading books at night with almost no ambient light would have destroyed my eyes.


No idea, but I'm not sure it's material to whether "staring at books" is a good description of reading.


Probably yes.

At least with [1], part of the problem is (very simplified) focusing too long on objects too near (book, screen, ...).

[1] https://en.wikipedia.org/wiki/Near-sightedness


There's always been a strong correlation between near-sightedness and reading books or looking at screens in childhood and early adulthood, and it was assumed for a long time that it was because of what you described (focusing on objects too near), but the latest research suggests something much more interesting is happening.

Children who spend a couple hours per day outside have extremely low rates of near-sightedness, so they think there's something about the eyes being exposed to direct sunlight that's necessary for them to grow properly.


Screens haven't damaged your eyes for a long time. That was only true of the old CRT displays, which literally shot high-frequency EM at your eyes.


Yes, it does, especially when reading in poor lighting and poor posture.

It turns out the best way to not damage an organ is to not use it...


Do screens do damage to your eyes? The article lists some side effects, but I wouldn't consider that any sort of eye damage, and it's certainly not permanent.


do screens damage your eyes?


Ugh, I did not need to see this today :(


Ironic


Collecting info, reacting to info, organising info, communicating info, etc.

Otherwise we might as well say we spend our lives just looking. It is not "staring at" anything; it is something else.


speak for yourself, some of us are definitely above average

>_>


PhDs will spend 5 years staring at paper


And programmers?


I read this, and wanted to put my phone down. I failed.


Pfft. Amateurs.


time well spent, imo


That fucking sucks.


seems low


Average adult will spend 31 years of their life lying in bed

If we want to be pessimistic and reductionist, there are many other ways to do this too



