Hacker News | nye2k's comments

I worked for a decade on what I would consider the highest standard of kids' privacy ever designed, at PBS KIDS. This came after a startup that attempted to do the same for grownups but failed because of dirty money.

Every security attempt becomes a facade or veil in time, unless it's nothing. Capture nothing, keep nothing, say nothing. Kids are smart AF and will outlearn you faster than you can think. Don't even try to capture PII ever. Watch the waves and follow their flow, make things for them to learn from but be extremely careful how you let the grownups in, and do it in pairs, never alone.


I love this as a perfect example of old systems interfering with the new systems that are rewriting those old systems.

My old man started his tech work on hot rods, then mechanical typewriters and calculators, eventually moving into mainframe electronics, and followed nearly every transition up to today’s AI.

The number of times I’ve scratched my head at a problem and he had a clear understanding of where the logic broke… based on a historical decision that could not physically be undone.


If you try profiling almost any program that does linear algebra (something that uses Numpy, for instance), you will see a lot of calls and CPU time in functions with names like DGETRF or SGESVX. These obscure names stand for things like Single-precision GEneral matrix Solve, eXpert driver; i.e., solve a linear system of equations with a full, dense matrix. Why are they so difficult to parse? Couldn't they come up with a friendlier name?

They come from Lapack, the standard linear algebra foundation library, which is written in Fortran 77. That library was first written in 1992, when the Fortran 90 standard was still new and not supported everywhere, so they stuck with the earlier version. Lapack has become the standard library for dense non-parallel linear algebra; it is still maintained and updated, but the basic math algorithms haven't changed much, so there was no need to replace it entirely. Today there are also processor-specific libraries like MKL or Apple Accelerate, but they still all follow the same Lapack API.

When Fortran 77 was standardized, they decided to keep function names at most six letters long, to "ensure portability"; i.e., they wanted to support certain compilers and architectures that were already considered old in 1977.

TL;DR: if you can't easily read those flame graphs today, it's because of backward compatibility with certain mainframes that probably date back to the late 1960s.
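The scheme is mechanical once you know the fields: one letter for precision, two for matrix type, and the rest for the operation. A toy decoder, as a sketch (the tables below cover only a handful of entries, and the descriptions are my own paraphrases, not official LAPACK wording):

```python
# Decode a LAPACK routine name into precision + matrix type + operation.
# Only a few common entries are included; this is illustrative, not complete.

PRECISION = {
    "S": "single-precision real", "D": "double-precision real",
    "C": "single-precision complex", "Z": "double-precision complex",
}
MATRIX = {
    "GE": "general dense", "TR": "triangular",
    "SY": "symmetric", "PO": "positive definite",
}
OPERATION = {
    "TRF": "triangular factorization (LU/Cholesky)",
    "TRS": "solve using an existing factorization",
    "SV": "solve a linear system (simple driver)",
    "SVX": "solve a linear system (expert driver)",
}

def decode_lapack_name(name: str) -> str:
    """Split a name like 'DGETRF' into its three naming-scheme fields."""
    name = name.upper()
    prec, mtype, op = name[0], name[1:3], name[3:]
    return (f"{name}: {PRECISION.get(prec, '?')}, "
            f"{MATRIX.get(mtype, '?')} matrix, "
            f"{OPERATION.get(op, '?')}")

print(decode_lapack_name("DGETRF"))
# DGETRF: double-precision real, general dense matrix, triangular factorization (LU/Cholesky)
```

Once you have seen a few of these, DGETRF and SGESVX stop looking like line noise and start looking like fixed-width records, which is essentially what they are.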


In particular, six-letter function names may have been convenient on mainframes that used 6-bit alphanumerics in 36-bit words (exactly six characters per word), the 36-bit word itself having been backward compatible with 10-decimal-digit electromechanical calculators.

https://en.wikipedia.org/wiki/36-bit_computing#History

EDIT: I had thought 10 digits of precision were required for certain calculations, but the WP article points out that they may simply have corresponded to the operators having 10 digits on two hands. In that case we're being backward compatible with Hox genes, specifically Hoxd, and tetrapod pentadactyly makes the backward compatibility hundreds of millions of years deep:

https://www.popsci.com/science/why-five-fingers-toes/
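For concreteness, here is a toy sketch of how six 6-bit character codes fit exactly into one 36-bit word. The character table is a made-up subset for illustration, not the historical SIXBIT or FIODEC encoding:

```python
# Pack a six-character symbol into a single 36-bit word, 6 bits per character.
# Toy character table (37 codes, all < 64); not a real historical encoding.
CHARS = " ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def pack36(name: str) -> int:
    """Pad/truncate to six characters and pack into one 36-bit integer."""
    name = name.upper().ljust(6)[:6]
    word = 0
    for ch in name:
        word = (word << 6) | CHARS.index(ch)
    assert word < 2 ** 36  # six 6-bit codes always fit in 36 bits
    return word

def unpack36(word: int) -> str:
    """Recover the six characters, dropping the trailing pad spaces."""
    chars = []
    for _ in range(6):
        chars.append(CHARS[word & 0o77])  # low 6 bits
        word >>= 6
    return "".join(reversed(chars)).rstrip()

assert unpack36(pack36("DGETRF")) == "DGETRF"
```

With this layout, a symbol table entry is one word per name, and comparing two names is a single word comparison, which is part of why six characters was such a comfortable limit on these machines.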


It had more to do with punch cards, Flexowriter tapes, and octal, which predate large word sizes or even mainframes. Note the following from the MIDAS macro assembler [0]:

Fortran predates this and came from a different lineage (IBM), but note how six-character symbols were a user request:

> The MACRO language had been used on the TX-0 for some three years previous to the writing of MIDAS. Hence, MIDAS incorporates most of the features which have been requested by users of MACRO, such as more flexible macro instructions, six character symbols and relocation.

Note that when B was ported to the PDP-11, which used ASCII rather than the earlier FIODEC/Flexowriter 6-bit paper tapes, that is supposedly why C case statements fall through; as an example, fall-through was used to allow lower-case commands in ed.

Flexowriters are 1940s IIRC, and the TX-0 through the early PDPs were octal machines, so it makes sense to grow in multiples of the 3-bit groups on the paper tape's lines.

[0] http://bitsavers.org/pdf/mit/rle_pdp1/memos/PDP-1_MIDAS.pdf


Also note that you can count to 12 on one hand (using the thumb to touch the three segments of each of the other four fingers) and to 60 with the other hand tracking completed dozens. That is why the ancient Sumerians used base 60. Base 10 was added to the Roman abacus, but they still kept the uncia (1/12) for some functions.

IIRC that wasn’t dropped until the Renaissance, when they read Archimedes’ attempt to calculate the number of grains of sand needed to fill the universe; he used a decimal-based system, and they asserted it was superior.

So you can consider decimal as tech debt:)
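The segment-counting arithmetic behind that claim, spelled out (this is the commonly told reconstruction of Sumerian finger counting, not settled history):

```python
# One hand: the thumb points at finger segments, 3 per finger x 4 fingers = 12.
# The other hand: each of its 5 fingers marks a completed dozen, 5 x 12 = 60.
SEGMENTS_PER_FINGER = 3
COUNTING_FINGERS = 4        # the thumb does the pointing, so it isn't counted
DOZEN_TRACKING_FINGERS = 5  # fingers on the other hand

units_per_hand = SEGMENTS_PER_FINGER * COUNTING_FINGERS
max_count = units_per_hand * DOZEN_TRACKING_FINGERS
print(units_per_hand, max_count)  # 12 60
```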


At my first job circa 1990, our codebase was constrained to 6-character function names in the core libraries, which had to run on many platforms including mainframes. If I recall correctly, you could have longer names, but only the first 6 characters were significant to the linker.

Never thought about why that might be other than "yeah, memory is expensive".
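You can simulate that linker behavior in a few lines. A hypothetical sketch, assuming symbols are compared only on their first six characters:

```python
# Simulate a linker for which only the first 6 characters of a symbol are
# significant: longer names that share a 6-character prefix silently collide.

def find_collisions(symbols, significant=6):
    """Return pairs of distinct symbols the linker would treat as identical."""
    seen = {}
    collisions = []
    for sym in symbols:
        key = sym[:significant]
        if key in seen and seen[key] != sym:
            collisions.append((seen[key], sym))
        seen.setdefault(key, sym)
    return collisions

syms = ["matmul_fast", "matmul_slow", "dgetrf"]
print(find_collisions(syms))  # [('matmul_fast', 'matmul_slow')]
```

Under such a linker, the only safe convention is to make the first six characters unique, which in practice means keeping names to six characters, as the core libraries did.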


Wasn't there a very similar library earlier than 1992? I seem to recall Linpack back in the early 1980s that sounded very similar.


That is correct; I did not mention Linpack. It had different function names than Lapack (the naming scheme was similar, and still constrained to 6 letters); for instance, DGETRF was named DGEFA in Linpack. [1]

[1]: https://netlib.org/linpack/dgefa.f


Yes. Lapack was the successor to linpack and I seem to recall some of the linpack routines going back much further than the eighties. MATLAB (which existed before the commercial release in 1984) was built on linpack.


Cue obligatory reference to the programmer archaeologists in Vernor Vinge's novel A Deepness in the Sky. Their job, on starships, is to safely bodge the multiple strata of software that have accreted since Mankind left Earth, centuries before.


I'm pretty sure we've achieved that already, centuries ahead of schedule :-)


The answer to any question of the form "why is something the way it is?" is always "historical reasons".


Have seen this time and time again during my career.

Most of the time, it's something you could never conceivably figure out without having been there at the time. But after 10 seconds on the phone or a brief email from someone who was, it makes complete sense.


I have been developing a game with this process, specifically for portability, reach and distribution across multiple game engines and platforms.

I find CUX very intuitive for prototyping. But my game is Language and HCI at heart, with logic that allows the development process to go smoothly. It is certainly not for everyone or every project.


I want to add in, as I used a ton of JS back in the day for a GUI that would build prepress-ready PDFs and ship them direct to giant Xerox printers for a company called Copy General - the early days of on-demand printing.

The PDF format was an awesome, broad shift for the early digital printers and has been a nice standard for a long time.

Adobe uses Acrobat as leverage in this game. Reader is the public’s only peephole and they have famously kept the features lean.


The problem to solve is that these clubs already exist, with low membership. Join a local Masonic lodge or another local social org if you want to meet men you will learn from.

Boy Scouts, DeMolay, and other boys' youth orgs got their start in the 1910s, joined by young men without fathers who were lonely due to their life situations. Many just need someone to take their hand and show them how to break the ice.


I joined the Lions about three years ago. Our local chapter was defunct, so I got it going again.

Now we have about 40 members (for a town of 500, that's pretty damn good).

We get together to do volunteer and fund raising events. But mostly we meet twice a month to eat supper and bullshit/play cards/pool/darts.

It's awesome.

I think we're going to bring back the formal dances they used to have in the 40's-60's. I think that would be fun.


Masons and the Scottish Rite (and even the Boy Scouts and DeMolay to some extent) have religious components that exclude people like me.


All Masonic and similar type clubs in my area openly accept women.


We’ve been using Lottie for years now for certain PBS KIDS brand animations, and it has multiple benefits over other formats. As with any runtime rendering in a 2D plane, it takes performance hits at scale. Lottie fits into all our pipelines and workflows nicely: game, app, video. We run Lottie files as idle background animations on the home layer across many platforms - and then deliver a static experience for devices that don’t support them, like Roku.

After Effects is a beast, and with this workflow a single person can animate a loop that we can then export as Lottie/Bodymovin JSON, as a MOV for broadcast & YouTube, and simplify into an SVG for low-end users.

Not to mention it has all been a great stopgap after Flash.

Now we use Rive too, and can import those JSON animations into new workflows. I have personally worked with several core folks in this animation space, including Hernan, Mat Groves of Pixi, and Matt Karl of CloudKid, all of whom tackled these late-Flash transitions with plugins, new export formats, and math.

I have learned that all of these efforts have their place, and they all have their own formats, which are often incompatible with each other because of the way major software packages organize animation over a timeline.

Choose your battles, pick the right tool for the project.


Love the pragmatic take. It's easy to get caught up in tool vs. tool debates, but the reality is that every format/workflow has its sweet spot.


The explore/exploit tradeoff immediately seems like a natural connection to some synaptic pruning and forking research that has been popping up around neurodiverse brain patterns.

Is the individual following a typical known neural path to find the berries because they have already learned it to be successful?

Or are they instead connecting new neural pathways along the way and following atypical unanswered thought threads in an attempt to find a new and better option... for some motivated reason?

Seems more useful to study over a longer term, across many similar tasks.


I could not find the remote to my TV this morning and attempted to use my iPhone instead. By the time I arrived at the correct UI my 3 yr old had already found a PS4 controller and was able to control the TV and navigate to where he wanted to go... I only needed to notch up the volume.

Assistive devices are necessary for a large audience, as they allow users to leverage their strengths. Just as my 3yr old beat my phone speed with a game controller, users will be able to type faster than me with this keyboard.

It is nice to have a single device that tries to do it all, but interacting with flat UI buttons in a 2D plane of light and glass is limited to a very small set of sensory inputs and therefore cumbersome for anyone to use. There is physically no way around this HCI problem without adding additional hardware. Thanks for working to bridge the gap!


I remember my excitement when CSS3 became reality in modern browsers. I was closely following this project, which attempted to recreate Robert Bringhurst’s The Elements of Typographic Style in CSS. That project made it clear just how far we were from a legible, printable web.

http://webtypography.net/


This was my immediate thought also - as I embrace my ADHD as success, not failure. After 10 years of web design/dev work I found myself spending more time in SQL than design and left to study classical animation. At 20 years I have managed to cobble together a useful, successful, maybe even desirable wildcard career in edu kids media, collecting accolades along the way that I couldn't care less about.

I like to think of my ADHD as the superpower of subconscious thought. When I wrangle the focus, things percolate quickly and I will create very interesting work at an incredibly high production value. This happens both alone and with a team and I believe it to be related to a wide and varied skillset--master of none.

Success happens so frequently that I have been able to learn some conditions to gain focus so results are fairly repeatable. My work gets a lot of eyeballs, folks see this value and will put me on new projects or simply come to me for validation of their ideas.

Still, I'm not the easiest person for neuro-typicals to work around, and after 20 years that is unlikely to change much. I keep my job because I'm always needed - I can always do the thing that needs to be done, or help a team to deliver. It helps that I'm also kind, and fun to be around. But, as projects mature I am eventually phased out for a larger team of stable redundancy and I have to cope with losing the thing that I built and love.

My joy comes with learning something very new and very challenging, casting light on the unknown by diving head-first before others think. My career is successful because I am skilled and able to take on the risks that others are afraid to spend the time or resources on. I am somehow already prepared, interested, and on staff.

