Hacker News | vamos_davai's comments

I'm ready to fight for my 401k to defend DRAM cartels


"To simplify it greatly, an LLM neuron is a single input single output function". This is very wrong unless I'm mistaken. A synthetic neuron is multiple input single output.


Tens of thousands of extremely complex analog inputs; one output with several thousand targets that MIGHT receive it with different timing and quality.

One neuron is unfathomably complex. It's offensive to biology to call a cell in a mathematical matrix a neuron.


There's no difference between conspiracy and hypothesis.


You mean "conspiracy theory" and "hypothesis". And yes, there fundamentally is a difference between theory and hypothesis:

https://www.merriam-webster.com/words-at-play/difference-bet...


> Outside of scientific reasoning, "theory" and "hypothesis" are often used interchangeably, and "theory" can unfortunately be interpreted to mean "less sound" or "lightly speculated."


Brave | Full-time | Remote | Senior Software Engineer (Ruby on Rails)

Brave is looking for an experienced Senior Software Engineer who can lead development on our Ruby on Rails-based Brave Rewards Creators app. We are looking for somebody who always prefers a simple solution over a complex one and who can take solutions from end to end with their full-stack skills.

Required qualifications:

* Strong Ruby on Rails expertise and experience
* Experience with JavaScript and other frontend technologies (e.g. CSS, React)
* Experience with SQL and Redis
* Experience with software development via distributed development teams
* Comfortable working in an open source setting
* A passion for helping protect users' privacy and security
* Written and verbal communication skills in fluent English
* Proven record of getting things done

This is just one of many jobs that you can read about at https://brave.com/careers/


Brave | Software Engineer (Ruby on Rails) | REMOTE US/Canada | Full-time | brave.com

Brave is looking for an experienced Software Engineer to work on our Ruby on Rails publisher app. This is a high-profile, impactful, hands-on position in an early-stage startup. We’re primarily looking for someone with strong front-end skills.

Requirements

* 2+ years of experience with Ruby on Rails
* Working experience with JavaScript
* Enthusiasm and familiarity with blockchain
* Experience with software development via distributed development teams
* Comfortable working in an open source setting
* A passion for helping protect users' privacy and security
* Written and verbal communication skills in fluent English
* Proven record of getting things done

https://brave.com/jobs?gh_jid=1211193 and see our other listings on https://brave.com/jobs


Go to reddit if you want to make average IQ humor.


Wouldn't it make more sense to optimize memristive technologies?


I don't really understand quantum computing, and hopefully someone can shed light and let me know if my understanding is wrong. Isn't a qubit similar to a transistor that can hold more than 2 states, with the implication that it'll be faster than current SIMD operations?


It's weirder than that. Some quantum algorithms offer a big-O speedup over classical ones. Shor's algorithm factors integers in polynomial time, which is categorically faster than any known classical algorithm for integer factoring, since the fastest classical algorithms are subexponential. For every subexponential algorithm, there will be some n at which Shor's algorithm will be faster, and some slightly higher n at which it's much, much, much faster.

https://en.wikipedia.org/wiki/Shor%27s_algorithm


In no way am I qualified to talk about this with any sort of certainty, but I have been studying EE for the past 2 years and live in a house with a physicist who is probably far more well-read on this topic.

Classical computers are nice because they are predictable; if you want to order some product online, it wouldn't be very helpful if the bit stream of data coming into your computer was fuzzy and muddled, making it hard to tell a 0 and a 1 apart. Quantum computing takes advantage of quantum properties such as the uncertainty principle and quantum entanglement to store bits in a state of being either a 0 or a 1 until we "observe" the bit coming through (quantum bits -> qubits). This allows algorithms, experiments, and other quantum workloads to be run much faster than on any classical computer, since more data can be encoded into a single bit.

To minimize the fuzz in a q-computer, they freeze the sh*t out of it to keep the energy at a minimum, along with using materials which excel at conducting heat away (like diamonds). I believe the spin of an electron is what determines the qubit value, and when used in conjunction with quantum entanglement it will affect other electrons that are already entangled. My little knowledge ends here, but with this thought on quantum entanglement, you can see how flipping bit values can allow for more ways to store information with the same amount of "stuff" vs. a classical computer.


A quantum computer doesn't work much like a classical computer at all, so any analogy to transistors fails pretty quickly.

One key aspect of that is that all of the qubits are entangled. A qubit isn't in more than two states; it's in 2 states at once, and N qubits are in 2^N states at once. It's not quite as simple as that: you can't really just set up any algorithm on a quantum computer the way you would on a classical one. But those problems that are amenable to quantum computation (including, notably, prime factoring) can be solved very fast.
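To see the bookkeeping concretely, here's a toy classical simulation in Python/numpy (only a sketch of the math, not how real hardware works): the state of N qubits is a vector of 2^N complex amplitudes, and a Hadamard gate on every qubit spreads the amplitude across all of them.

    import numpy as np

    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate

    def uniform_superposition(n):
        # Start in |00...0> and apply H to every qubit.
        state = np.zeros(2**n)
        state[0] = 1.0
        op = H
        for _ in range(n - 1):
            op = np.kron(op, H)
        return op @ state

    psi = uniform_superposition(3)
    print(len(psi))  # 8 = 2^3 amplitudes tracked at once
    print(psi)       # each amplitude is 1/sqrt(8)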

With 2^53 qubits we should be able to factor some very, very large numbers.


You will find that this machine can't even factor the number 15 properly, as it's neither error-corrected nor a general-purpose quantum computer. FWIW, nobody has factored the number 15 with the full Shor algorithm yet; only with the subset of gates they know produces the numbers 3 and 5.


For those who are interested in the papers about this topic:

https://arxiv.org/abs/1301.7007

"Pretending to factor large numbers on a quantum computer (2013)"

"Of course this should not be considered a serious demonstration of Shor’s algorithm. It does, however, illustrate the danger in “compiled” demonstrations of Shor’s algorithm. To varying degrees, all previous factorization experiments have benefited from this artifice. While there is no objection to having a classical compiler help design a quantum circuit (indeed, probably all quantum computers will function in this way), it is not legitimate for a compiler to know the answer to the problem being solved. To even call such a procedure compilation is an abuse of language."

More references:

https://crypto.stackexchange.com/questions/59795/largest-int...


You keep repeating this, but as best as I can tell, it hasn't been factually accurate in a decade, unless you really want to quibble over the fact that they finished the calculation on a general purpose computer.

https://www.schneier.com/blog/archives/2009/09/quantum_compu...

>Instead, it came up with an answer to the "order-finding routine," the "computationally hard" part of Shor's algorithm that requires a quantum calculation to solve the problem in a reasonable amount of time.

They weren't the first team to do this, either; they just miniaturized parts of the hardware.


The result you cite doesn't even say what you think it says; in fact it directly says they didn't factor a damn thing.

If I'm wrong and IBM's latest "quantum computer" can actually run Shor's algorithm on the number 15 using 4608 gate operations, I will publicly eat one of my shoes like Werner Herzog. Assuming I can get someone else to take the other side of the bet, of course.


It says:

> the chip itself didn't just spit out 5 and 3. Instead, it came up with an answer to the "order-finding routine"

O noes! The quantum computer only did the "order-finding" part of Shor's algorithm! ... But wait. Here's the Wikipedia page for Shor's algorithm:

> Shor's algorithm consists of two parts: 1. A reduction, which can be done on a classical computer, of the factoring problem to the problem of order-finding. 2. A quantum algorithm to solve the order-finding problem.

So, the article GP cited says that a quantum computer did the part of factoring the number 15 that actually uses a quantum computer. Nothing wrong with that.
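To make the division of labor concrete, here's the classical post-processing step as a toy Python sketch (the standard textbook reduction, assuming the quantum order-finding routine has already handed us the order r of a modulo N; a = 7 and r = 4 are the usual example values for N = 15):

    from math import gcd

    def factors_from_order(N, a, r):
        # Classical part of Shor's algorithm: if r is even and
        # a^(r/2) != -1 (mod N), then gcd(a^(r/2) +/- 1, N) yields
        # nontrivial factors of N.
        if r % 2 != 0:
            return None
        x = pow(a, r // 2, N)
        if x == N - 1:
            return None
        return gcd(x - 1, N), gcd(x + 1, N)

    # The order of 7 mod 15 is 4 (7^1=7, 7^2=4, 7^3=13, 7^4=1 mod 15)
    print(factors_from_order(15, 7, 4))  # (3, 5)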

The article's link to the actual paper is broken, which makes it harder to tell whether, as you say, they cheated somehow, but here's another article about factoring 15 with a quantum computer, https://science.sciencemag.org/content/351/6277/1068, which so far as I can see claims to have actually done the whole of (the relevant part of) Shor's algorithm on actual QC hardware.


Even assuming I humor you and refuse to notice they left out the gates to the wrong answer, and assuming I humor the authors in agreeing that this is a scalable form of the Shor algorithm (I deny both of these for the record) .... congratulations: by 2016 someone was able to factor the number 15. I guess quantum supremacy is right around the corner!


>to notice they left out the gates to the wrong answer

Can you explain this a little more? I've seen similar references to this elsewhere. If it means what I think it means, it's kind of a really big deal. But it may mean something else.



That doesn't explain in what sense this entirely different paper from 20 years later allegedly "left out the gates to the wrong answer".

The paper I linked to says, e.g., the following:

> Subsequent multipliers can similarly be replaced with maps by considering only possible outputs of the previous multiplications. However, using such maps will become intractable, [...] Thus, controlled full modular multipliers should be implemented.

So in at least one case they are explicitly not taking a particular shortcut because it doesn't scale to factorizing larger numbers. If you say they are taking other shortcuts that don't scale by "leaving out the gates to the wrong answer", then I think you owe us an actual explanation of what shortcuts they are taking and how you know they're taking them, rather than just a link to a paper from 1996 that says how to take some shortcuts.


Frankly I don't owe you or the muppets attempting to participate in this thread by zombie-walking "MUH PRESS RELEASES" a damn thing: if you want to believe in pixie dust, you're free to do so. None of the "quantum computing results" factoring the number 15 have done the actual Shor algorithm -they've all used the shortcut described in this paper. Someone below posted another paper pointing out the same thing, as well as some discussion on a forum .... pointing out the same thing.

It's not my fault you believe in press releases without understanding what they mean.


I haven't said a thing about press releases.

The paper I linked to (1) doesn't cite Preskill et al and (2) explicitly claims not to be taking shortcuts that don't generalize to numbers other than 15; as well as the bit I quoted earlier, they say "for a demonstration of Shor’s algorithm in a scalable manner, special care must be taken to not oversimplify the implementation—for instance, by employing knowledge about the solution before the actual experimental application" and cite an article in Nature decrying cheaty oversimplifications of Shor's algorithm.

I don't see anything in their description of what they do that seems to me to match your talk of "deleting the gates that lead to the wrong answer".

(The Kitaev paper they cite also doesn't cite Preskill et al, unsurprisingly since it predates that, and also doesn't contain anything that looks to me like cheaty shortcut-taking.)

It is, of course, possible that that paper does take cheaty shortcuts and I've missed them. It is, of course, possible that its authors are flatly lying about what they're doing, and trying to hide the evidence by not citing important prior papers that told them how to do it. If so, perhaps you could show us where.

Otherwise, I for one will be concluding from the surfeit of bluster and absence of actual information in your comments so far that you're just assuming that every alleged "factoring of 15" is indulging in the dishonesty you think they are, and that you aren't interested in actually checking.

(You don't, indeed, owe anyone anything. It's just that if you want to be taken seriously, that's more likely if you offer something other than sneering and bluster.)


Again, as I said before: even assuming they didn't take shortcuts (the paper you mention is essentially "we took shortcut X instead of Y", and of course, no quantum error correction is evident): congratulations, your miracle technology is now capable of factoring the number 15. All they have to do now is add all the things that make quantum computing useful and eventually they will honestly be able to factor the number 21. Should happen ... I dunno, care to make a prediction as to when?

Pointing this out is apparently necessary; I don't know why it triggers people so to point out that virtually the entire field up to the present day has consisted of grandstanding quasi-frauds. And that you apparently have to understand things and read extremely carefully to notice, because whatever honest workers there may be don't see it as in their interest to point such things out as it may upset their rice bowls.

You have someone else in another thread insisting that annealing can factor giant numbers, which is equally bullshit. Do you expect me to patiently, precisely and (somehow) dispassionately point out every line of bullshit in every quantum computing paper published? The mere fact that the field is pervaded by bullshit, publishes papers and announcements that are known to be bullshit, and promises all kinds of pixie dust bullshit on a regular basis ought to give you some slight skepticism, some Bayesian prior that the grand pronunciamentos of this clown car should be treated with a bit of skepticism.


I agree that "can factor 15" isn't terribly impressive. (Well, it's kinda impressive, because making a quantum computer do anything at all is really hard.) I very much agree that D-Wave's quantum annealing stuff is bullshit, especially if anyone is claiming it's any good for factoring large numbers. I don't expect you, or anyone, to point out every bit of bullshit in every paper published.

I'm all in favour of pointing out bullshit. But there's a boy-who-cried-wolf problem if you just indiscriminately claim that everything is the same kind of bullshit without checking it.

Of course that doesn't mean that you're obliged to check everything. You can say "I expect this is bullshit of the usual sort but haven't checked". But if you say "this is bullshit of the usual sort" without checking and it turns out that that isn't the case (it looks to me as if the paper I linked to isn't the kind of bullshit you describe) then you take a credibility hit that makes all your bullshit-spotting much less useful than if you were more careful.


God, you've got a bone to pick, don't you? I see you every time there's a HN thread about quantum computing. Scott Aaronson posting under a sock account?


Unlike you, I post under my name. Aaronson doesn't know what he's talking about half the time.

Some people laugh at the chicanery of scumbags who claim fully autonomous vehicles are right around the corner. I laugh at the frauds and mountebanks of "quantum information theory" and the muppets who believe everything they say. De Gustibus.


I changed my mind: you called someone a bugman in another thread, so while you're wrong about the current state of QC, you're alright in my books.


An explanation in comic form with text by quantum computing researcher Scott Aaronson: https://www.smbc-comics.com/comic/the-talk-3

QC is about setting up interference patterns between the qubits; it is fundamentally unlike classical computing. For some problems, algorithms can be designed that use those interferences to compute things, like the famous Shor's algorithm for polynomial-time prime factoring.

QC speeds things up only in the cases where an asymptotically faster algorithm can be designed this way, it is not a general purpose parallelization mechanism.

And we don't know for sure many of the things it speeds up! For example: we don't actually know that there isn't a classical polynomial-time factoring algorithm we just haven't found yet. Here is a recent example of a quantum algorithm leading to the discovery of an equivalent classical algorithm: https://www.quantamagazine.org/teenager-finds-classical-alte...


> Isn't a qubit similar to a transistor that can hold more than 2 states ...

Not really. From an information theory perspective, holding two states is the most efficient way of computing. Or at least, there is no speed-up gained from doing computations in any other base than 2.

Quantum computers can be in both states at the same time (as far as that interpretation of quantum mechanics goes). So, if you keep the qubits all in this superposition state, you can calculate multiple things at the same time.


>From an information theory perspective, holding two states is the most efficient way of computing.

Assuming the cost of operations on each digit scales with its number of states, the most efficient base for computing with arbitrary numbers is 3, which is the closest integer to the optimal base e:

https://en.wikipedia.org/wiki/Radix_economy

https://en.wikipedia.org/wiki/Ternary_computer
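For anyone who wants to see the numbers: radix economy is (roughly) the base times the number of digits needed, E(b, N) = b * (floor(log_b N) + 1), and asymptotically it scales like b / ln b, which is minimized at b = e. A quick sketch:

    import math

    def radix_economy(b, N):
        # digits needed to write N in base b, times the base
        return b * (math.floor(math.log(N, b)) + 1)

    N = 123456
    for b in (2, 3, 4, 10, 16):
        print(b, radix_economy(b, N), round(b / math.log(b), 3))

    # base 3 edges out base 2: 3/ln(3) ~= 2.73 vs 2/ln(2) ~= 2.89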


Yes, you are correct. I said it all wrong.

From a physical perspective, holding two states is the most efficient way of computing. I.e. the electronics required to compute in bases other than 2 involve higher power and present difficult challenges.


Since other people have already explained how qubits are not at all similar to transistors, I'd like to mention that it is not comparable at all to SIMD and there is no indication that quantum computers will be better at vector math than classical computers.


I found this video to be a great explanation: https://www.youtube.com/watch?v=OWJCfOvochA

It explains the concept at 5 different levels of CS knowledge.


The gist of it is:

>> that can hold more than 2 states

--> that can hold more than 2 states AT THE SAME TIME (in weird quantum conditions)

So you're not trying 00, 01, 10, 11 one at a time; you're trying [00|01|10|11] at the same time, as well as other states in between 0 and 1.


I looked into developing in Half Moon Bay, which is zoned by San Mateo County. There are wide swathes of land up for grabs, but the county or the council forbids development unless you buy enough land and come up with a proposal. This ends up costing $5-$10 million for undeveloped land alone (no sewage line connection and no electricity). And these lands are planned unit developments, which in practice means homes only for the wealthy.


San Mateo and Marin have always opposed development. It has given us great open spaces up 101, down 280, and along 1. It's an odd position to defend, no development in Marin and San Mateo but increased development in San Francisco, yet that's my view.


Even in the spaces that are not open space, San Mateo opposes apartments.


Their platform, their voice. It's not part of the public domain. Unreasonable outrage from the HN crowd.


Just because you can do something does not make it acceptable to all people.

What you are talking about is essentially moral relativism. Most people see that as dubious.

Just because one entity sees something as right does not make it objectively right.

Can Instagram do this? Yes. Should everyone be okay with it in virtue of that ability? That comes down to personal opinion.

