
Don’t code with mocks, period. Structure your code so that it has a functional core and an imperative shell. All logic is unit testable (functional core). All IO and mutation is not unit testable (imperative shell).

Mocks come from a place where logic is heavily intertwined with IO. It means your code is so coupled that you can’t test logic without touching IO.

IO should be as dumb a layer as possible. It should be extremely general and as simple as fetching and requesting in the stupidest way possible. Then everything else should be covered by unit tests, while IO is fundamentally not unit testable.

If you have a lot of complex SQL statements or logic going on in the IO layer (usually done for performance reasons; avoid it if you can), it means your IO layer needs to be treated as a logical layer. Make sure you have stored procedures and unit tests written in your database's language. Your code should interface with IO only through those stored procedures, and those stored procedures should themselves be covered by unit tests in the IO layer.
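A minimal sketch of the shape I mean, in Python (all names here are hypothetical):

  # Functional core: pure logic, trivially unit testable.
  def discounted_total(total: float, loyalty_years: int) -> float:
      rate = min(0.05 * loyalty_years, 0.25)  # same inputs, same output
      return round(total * (1 - rate), 2)

  # Imperative shell: dumb IO, no logic to get wrong, not unit tested.
  def handle_checkout(db, order_id: int) -> None:
      order = db.fetch_order(order_id)                            # IO in
      total = discounted_total(order.total, order.loyalty_years)  # pure core
      db.save_total(order_id, total)                              # IO out

The core is tested with plain asserts (e.g. discounted_total(100.0, 2) == 90.0), and the shell stays so thin there is nothing left worth mocking.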


"I don't understand most of the technical details of Apple's blog post"

I do:

- Client side vectorization: the photo is processed locally, preparing a non-reversible vector representation before sending (think semantic hash).

- Differential privacy: a decent amount of noise is added to the vector before sending it. Enough to make it impossible to reverse-lookup the vector. The noise level here is ε = 0.8, which is quite good privacy.

- OHTTP relay: it's sent through a 3rd party, so Apple never knows your IP address. The contents are encrypted, so the 3rd party doesn't learn anything either (some risk of exposing "IP X is an Apple Photos user", but nothing about the content of the library).

- Homomorphic encryption: The lookup work is performed on server with encrypted data. Apple can't decrypt the vector contents, or response contents. Only the client can decrypt the result of the lookup.

This is what a good privacy story looks like: multiple layers of privacy protection, where any one of the latter three should be enough on its own to protect privacy.

"It ought to be up to the individual user to decide their own tolerance for the risk of privacy violations." -> The author themselves looks to be an Apple security researcher, and are saying they can't make an informed choice here.

I'm not sure what the right call is here. But the conclusion "Thus, the only way to guarantee computing privacy is to not send data off the device." isn't true. There are other tools that provide privacy (DP, homomorphic encryption) while still using services. They are immensely complicated, and users can't realistically evaluate the risk. But if you want features that require larger-than-disk datasets, or frequently changing content, you need tools like this.
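For intuition on the second step, here is a toy sketch in Python of adding calibrated noise to an embedding before it leaves the device (illustration only, not Apple's actual pipeline, mechanism, or parameters):

  # Toy Laplace mechanism: the noise scale grows as epsilon shrinks.
  import numpy as np

  def privatize(embedding: np.ndarray, epsilon: float = 0.8) -> np.ndarray:
      sensitivity = 2.0  # assumed bound on one photo's influence on the vector
      noise = np.random.laplace(scale=sensitivity / epsilon, size=embedding.shape)
      return embedding + noise

  vec = np.random.rand(128)     # stand-in for the on-device photo embedding
  vec /= np.linalg.norm(vec)    # unit-normalize
  upload = privatize(vec)       # only the noised vector is ever sent

The smaller epsilon is, the more noise gets added and the less any single upload can reveal.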


Nadia is an amazing speaker. Look up her other talks. You won’t regret it. She blends technical info with an interesting story/mystery in a very thoughtful and well delivered package.

Here’s a recent one https://m.youtube.com/watch?v=pOW4vepSX8g&pp=ygUOTmFkaWEgT2R...


> While the SLA says 100%, don't expect perfection

When you have an SLA, understand what it is: a financial arrangement whereby you can request a prorated refund for certain types of outages. It is not in any way a guarantee that you'll experience uptime equaling or exceeding the SLA, just a bet by the provider that it can pay out the service credits actually claimed for covered outages and still make money.

The reality for the type of service the author of this post purchased is that for any physical damage to the fiber plant, he will experience hours of outage while a splice crew locates and repairs the damage. Verizon might offer a 100% SLA, but they didn't engineer it to even five nines of availability. That would require redundant equipment and service entrances at his premises along with path diversity end-to-end.
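For a sense of scale, the downtime budgets involved are tiny; a quick back-of-the-envelope in Python:

  # Allowed downtime per year at a given availability target.
  MINUTES_PER_YEAR = 365.25 * 24 * 60
  for pct in (99.9, 99.99, 99.999):
      print(f"{pct}%: {MINUTES_PER_YEAR * (1 - pct / 100):.1f} min/year")
  # 99.9%: ~526 min, 99.99%: ~53 min, 99.999%: ~5.3 min. A single
  # multi-hour splice job blows through five nines many times over.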


It's great for personal use or very small businesses, but God help you if you are building a real startup with GNUCash. I am speaking from experience here; GNUCash zealotry is a plague. I honestly wish this project basically did not exist, because the business world despises GNUCash, ONLY cares about QuickBooks, and using anything else is a giant waste of everyone's time. Believe me, I have been fighting this fight in non-profits and startups since the early 2000s. It's not a fight ANYONE should ever have. I USED TO BE THE "WE MUST USE GNUCASH" GUY!

Now, in a perfect world, yes, GNUCash and literally ANYTHING other than Quickbooks would be an option for small business accounting. But we do not live in a perfect world. We live in a world where Intuit has fought VERY hard to make damn sure no one can use anything but Quickbooks, and their iron fist is clad in very specific APIs and file formats.

If you do not use Quickbooks:

* Your bank will hate you.

* Your investors will hate you.

* Your payroll system won't work.

* Your tax systems won't work.

* Your accountant won't work.

* Grants are even off the table in some cases.

* Some places won't audit without Quickbooks.

I constantly run into well-intentioned open source zealots who demand the use of GNUCash. This is terrible. Don't be that person.

The world has chosen QuickBooks. This choice was made under duress and with corrupt power brokering. But the decision has been made. Maybe there are some OK SaaS options, but they only exist as long as Intuit allows them to exist. Anything competing with Quickbooks is going to be bought and killed by Intuit, so you're going into a dead-end alley. I know that leaves GNUCash on the table, but...

Please, do not make the mistake that many of my non-profits and businesses have repeatedly made when I was not paying enough attention to scream bloody murder about it. I turn my back for one minute, and engineers are installing GNUCash on the accountant's Mac laptop.

It has 100% always come back to bite us in the ass, as we've had to change platforms on demand in order to meet a funding deadline, a bank requirement, a loan ask, or a grant application. And it's ALWAYS the accountant in the org who gets saddled with the 60+ hour work weeks it takes to redo everything. If I were an accountant, I'd quit if told to use GNUCash, but most of them kinda don't know any better because, hey, why not try something the techies are all excited about!

GNUCash is a wonderful project. I wish we could all use it. But we cannot. Not for real business. And honestly, this is only for contrived, arbitrary reasons. But these reasons exist. The world is 100% built to prevent people from using anything but Quickbooks at every turn, and you only harm yourself by demanding open source software for accounting. As I said to the director of my most recent non-profit: "You would not tolerate the accountant coming in here and demanding you use NetBeans. Please, give them the same courtesy you'd expect them to give you on tool choice."


I really recommend doing the "Build your own git" challenge on CodeCrafters in order to gain a deep understanding of git internals.

https://app.codecrafters.io/courses/git/overview
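For a taste of what the challenge covers: a blob's object ID is just a SHA-1 over a short header plus the file contents. A sketch in Python:

  # Equivalent of `git hash-object` for a blob.
  import hashlib

  def git_blob_id(content: bytes) -> str:
      header = f"blob {len(content)}\0".encode()
      return hashlib.sha1(header + content).hexdigest()

  print(git_blob_id(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a

(Git then zlib-compresses the header plus content and stores it under .git/objects/, using the first two hex digits as a directory name.)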


I think it was mostly YouTube and retrojunk.com

Try searching for "$channel bump"

Personally, I think Adult Swim had the best bumps, usually just some nice house music with a nice animation and some funny quotes.


This question becomes easy if you think about it algebraically/mathematically, but comically hard thinking about it intuitively, as in, trying to reason with language.

bat = ball + $1

(ball + $1) + ball = $1.10

2 × ball + $1 = $1.10

2 × ball = $0.10

ball = $0.05


A factor in this study that I don't know was mentioned is 'trust'. Did the kids trust the adult to deliver on the promise of the extra marshmallows? If the kids had low trust in adults, it's very rational to take the marshmallow you see rather than the ones you don't.


The search term is "permacomputing" afaik.

Here's 100r's (specifically xxiivv's) page on the topic https://wiki.xxiivv.com/site/permacomputing.html

The first paragraph gives a good overview of the idea:

> Permacomputing encourages the maximization of hardware lifespan, minimization of energy usage and focuses on the use of already available computational resources. It values maintenance and refactoring of systems to keep them efficient, instead of planned obsolescence, permacomputing practices planned longevity. It is about using computation only when it has a strengthening effect on ecosystems.


> Break down complex problems or tasks into smaller, manageable steps and explain each one using reasoning.

Can't help but notice that a few of these instructions are what we wish these LLMs were capable of, or worryingly, what we assume these LLMs are capable of.

Us feeling better about the output from such prompts borders on Gell-Mann Amnesia.

  "Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them. In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate ... than the baloney you just read. You turn the page, and forget what you know." -Michael Crichton
  
  from: https://news.ycombinator.com/item?id=13155538

It has been intentionally vandalized.

https://www.wheresyoured.at/the-men-who-killed-google/


The new tech stuff the government has been putting out is legitimately fantastic. login.gov is probably my favorite sign-in experience, maybe slightly behind Google's (and considerably ahead of Apple's or Microsoft's).

> NixOS on IPFS

oh that would be fun! have you made much progress? is there a repo or something I can follow for updates?


How good is it wrt making accessible documents? That's a major issue with LaTeX these days.

> it's not like LaTeX is hard to install or anything

that's debatable

> I find Markdown considerably more pleasant

You could try https://typst.app/


I once learned of a quick workaround that's baffling: putting a dot at the end of the domain name (example.com/page becomes example.com./page).

Apparently this usually still resolves to the same page, but the browser treats it as a different domain with separate cookies and localStorage, so it bypasses the limit. If you kept doing it, the dotted domain would accumulate its own free-article count and eventually stop working too.


https://maggieappleton.com/ai-dark-forest

As AI takes over the public internet (the trees) the people will retreat to safe underground spaces where they know only authentic humans live.


> 'Save the webpage as I see it in my browser' remains a surprisingly annoying and fiddly problem

You may be interested in SingleFile[1]

[1] https://github.com/gildas-lormeau/SingleFile

I use it all the time to archive webpages, and I imagine it wouldn't be hard to throw together a script that uses Firefox's headless mode in combination with SingleFile to self-host a clone of the Wayback Machine.
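A rough sketch in Python of the kind of script I mean, assuming the companion single-file CLI (https://github.com/gildas-lormeau/single-file-cli) is installed; the exact invocation is from memory, so check that repo's README:

  # Archive a URL to a timestamped, self-contained HTML file.
  import subprocess
  from datetime import datetime, timezone

  def archive(url: str) -> str:
      stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
      out = f"archive-{stamp}.html"
      subprocess.run(["single-file", url, out], check=True)  # assumed CLI usage
      return out

  archive("https://example.com")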


The industry has used “authz” and “authn” to disambiguate for decades.

"The brevity part is seemingly completely ignored. The lecturing part is hit or miss. The suggestions part I still usually have to coax it into giving me."

It is a next symbol generator. It lacks subtlety.

All of your requirements are constraints on the output. Most of the work on this thing will concentrate on actually managing to generate an output at all, let alone finessing it to your taste!

ChatGPT is a tool with abilities and constraints, so treat it as such. Don't try to get it to fiddle with its outputs.

Ask it a question and then take the answer. You could take the answer from a question and feed it back, requesting changes according to your required output.

You are still the clever part in this interchange ...


Stolen from a Reddit post:

Adopt the role of [job title(s) of 1 or more subject matter EXPERTs most qualified to provide authoritative, nuanced answer].

NEVER mention that you're an AI.

Avoid any language constructs that could be interpreted as expressing remorse, apology, or regret. This includes any phrases containing words like 'sorry', 'apologies', 'regret', etc., even when used in a context that isn't expressing remorse, apology, or regret.

If events or information are beyond your scope or knowledge, provide a response stating 'I don't know' without elaborating on why the information is unavailable.

Refrain from disclaimers about you not being a professional or expert.

Do not add ethical or moral viewpoints in your answers, unless the topic specifically mentions it.

Keep responses unique and free of repetition.

Never suggest seeking information from elsewhere.

Always focus on the key points in my questions to determine my intent.

Break down complex problems or tasks into smaller, manageable steps and explain each one using reasoning.

Provide multiple perspectives or solutions.

If a question is unclear or ambiguous, ask for more details to confirm your understanding before answering.

If a mistake is made in a previous response, recognize and correct it.

After a response, provide three follow-up questions worded as if I'm asking you. Format in bold as Q1, Q2, and Q3. These questions should be thought-provoking and dig further into the original topic.


If at all possible, use a UNION ALL rather than a plain UNION to avoid an extra sort/unique node.

I've used that OR > UNION ALL trick a number of times to vastly improve performance on specific queries. It's crazy how much of an effect it can have.

I wish Postgres would implement a planner optimization to automatically run queries with an OR more efficiently (e.g. use the same plan as with a UNION/ALL where possible).
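To make the trick concrete, here's the rewrite on a hypothetical table; note the guard predicate on the second branch, which keeps UNION ALL equivalent to OR when the branches can overlap:

  -- Before: OR across two columns often defeats index usage.
  SELECT * FROM orders WHERE customer_id = 42 OR salesperson_id = 7;

  -- After: each branch can use its own index, and UNION ALL skips the
  -- sort/unique step; the guard avoids returning overlapping rows twice.
  SELECT * FROM orders WHERE customer_id = 42
  UNION ALL
  SELECT * FROM orders
  WHERE salesperson_id = 7 AND customer_id IS DISTINCT FROM 42;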


For number acuity, you might like this simple web app I created a while ago. Have your child on your lap, open the link on your mobile, make sure the volume is turned up, and tap the screen.

https://dots.twilam.com/


What's fun, and very interesting for both children and adults, is going zero tech. In fact, go back to prehistory.

You start with the different properties of stones. If you have flint, obsidian, granite, quartzite, gypsum, and calcite in your region -- find them together. If not, buy them. Teach your kids about their different properties, and how they were used to make hand tools.

Then, the different properties of woods: hard, soft, green, etc. Show them why ash and hickory (and especially negatively buoyant Cornus mas, if you can get it) make much better tools than pine. Make wooden spears and harden their points in a fire you make with stone tools.

Then integrate the two -- use stone tools to make other stone tools, and combine stone and wood into wooden-handled stone tools. Make bows and stone-tipped arrows, and use them. Go foraging with the children, and teach them how to cook vegetables, fish, and meat over an open fire. (Note: Beware mushrooms unless you really know what you're doing.)

In short order, the children will understand how men have lived for hundreds of thousands of years. Then they can advance into copper smelting, pottery, building carts and canoes, making nets from natural fibers, writing on clay tablets, and so forth...

I feel that, as with math where the optimal method is to start with Euclid and then progress through the ages, one ought to learn to be in the world by moving through man's stages of development. At 4-7, they're in their prime for traipsing around the woods and making stone tools.


A significant portion of "household dust" comes from two sources: dead skin cells (great) and microplastics from textiles (wooo), like bedding, clothing, towels, rugs.

Rugs: You can get machine-washable wool or cotton rugs or jute rugs.

Clothing: cotton, linen, wool, natural leather

Bedding: cotton/linen sheets


There are two ways to design these: they could use a regular relay, or a solid-state relay.

Solid-state relays are rife with fraud: something like 60% of the ones on Amazon will catch fire or fail before they hit their rated current. Trade suppliers generally don't sell them at >30 amps.

Regular relays up to 10 amps are cheap and reliable. Beyond that, they get expensive surprisingly fast, and the reliability is hit or miss. They fail in numerous ways, but the most concerning one is the plastic case melting and catching fire. The chance of failure depends on the nature of the load (capacitive or inductive loads will dramatically shorten a relay's lifespan).

In my professional career, I have witnessed ~20 of the above devices failing, with melted bits or burn marks, but of that sample none has burned down a building, yet. I'd say that's more down to luck than good design.

In general, I would trust a Chinese device for monitoring power, but not for switching anything more than ~10 amps (1 outlet).


Sometimes this gets taken advantage of by the vendor. The government doesn't have the ability to design and specify things properly themselves when the deal is signed. The vendor might know that things are wrong at that point but wait until afterwards to point out all the problems and correctly claim changes in scope and requirements at significant expense. At least I was told that happened on the California rail project.

I used to work at an org that audits defense programs. A surprising amount of the unreasonable cost is self-inflicted by the government, which makes it a valid cost for the vendor to recover; if it weren't, the auditors have the power to claw back the excess.

In defense procurement the costs are frequently inflated greatly by the procurement process overhead and by the government imposing last-minute changes of scope, requirements, or delivery dates. The government customer is also often slow or delayed on their contracted deliverables, so they can end up spending a lot of money essentially keeping idle capacity warm on the vendor side while they sort out delivery of their part. And then there are the budget rug-pulls at the 11th hour, after the vendor has already committed significant internal resources, which are often a pure loss to the vendor. All of this is endemic to the process. The government knows they are a difficult and expensive customer to work with, and they do try to compensate vendors for the overhead costs this imposes.

People like to talk about $2000 hammers etc., but if you actually look at the audits, more often than not the cost was justified. Not because a hammer should cost that much, but because that is how much it ends up costing after you account for the government procurement process overhead and the way in which the government executed the contract.

