Hacker News | singham's comments

Daniel Dennett has been saying this for quite a while.


I have deleted thousands of Google Groups messages and never had any problem. Maybe it's just you.


Google Groups behaves entirely differently from Gmail.


How are you going to recreate the main selling point of Facebook, i.e. the newsfeed? If things are encrypted, how are you going to determine relevance, etc.?


> How are you going to recreate the main selling point of Facebook, i.e. the newsfeed?

When you post, the content is sent as an encrypted message that can only be decrypted by people who should be able to see it (either all of your friends or just a subset of them). The client automatically takes care of key management and is responsible for keeping track of sent & received posts, comments, etc; it renders a news-feed-like UI on top of that data. The end result is a familiar UX on the surface, but the underlying mechanism for transmitting data between users is far more secure than the traditional approach of using a monolithic database that contains everyone's data in plaintext.
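To make the mechanism concrete, here is a toy sketch in Python of the "encrypt the post once, wrap the post key per recipient" pattern described above. All names are illustrative, and the XOR-keystream "cipher" is a stand-in for real authenticated encryption (e.g. NaCl boxes or AES-GCM), not something to actually use:

```python
# Toy sketch of per-recipient encrypted posts with client-side key
# management. NOT real cryptography: _keystream_xor is a SHA-256-based
# toy cipher standing in for a proper authenticated cipher.
import hashlib
import secrets


def _keystream_xor(key: bytes, data: bytes) -> bytes:
    # Derive a keystream from the key and XOR it with the data.
    # XOR is its own inverse, so this both encrypts and decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))


def publish(post: str, shared_keys: dict) -> dict:
    """Encrypt one post under a fresh random key, then wrap that key
    once per friend who should be able to read it."""
    post_key = secrets.token_bytes(32)
    return {
        "ciphertext": _keystream_xor(post_key, post.encode()),
        "wrapped_keys": {friend: _keystream_xor(k, post_key)
                         for friend, k in shared_keys.items()},
    }


def read(message: dict, friend: str, shared_key: bytes) -> str:
    # The client unwraps its copy of the post key, then decrypts the post.
    post_key = _keystream_xor(shared_key, message["wrapped_keys"][friend])
    return _keystream_xor(post_key, message["ciphertext"]).decode()


keys = {"alice": secrets.token_bytes(32), "bob": secrets.token_bytes(32)}
msg = publish("hello, friends", shared_keys=keys)
assert read(msg, "alice", keys["alice"]) == "hello, friends"
```

The server only ever stores `ciphertext` and `wrapped_keys`; everything readable lives on the clients, which is the point being made above.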

> If things are encrypted, how are you going to determine relevance, etc.

The client is solely responsible for that. For now the "news feed" is strictly chronological, but I plan to augment it later by prioritizing posts that might be particularly interesting to the user. There are plenty of ways to make those decisions locally.
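One way such a local decision could look, as a sketch (the scoring formula and field names are illustrative assumptions, not the actual client's logic): keep the feed roughly chronological but boost authors the user interacts with most, using only data already on the device.

```python
# Sketch of purely client-side feed ranking: chronological order plus a
# simple affinity boost computed from local interaction counts.
import time


def rank_feed(posts, interaction_counts, now=None):
    """Sort posts newest-first, boosted by how often the user has
    interacted with each author. All inputs live on the client."""
    now = now if now is not None else time.time()

    def score(post):
        age_hours = (now - post["timestamp"]) / 3600
        affinity = interaction_counts.get(post["author"], 0)
        return affinity - age_hours  # newer and higher-affinity rank first

    return sorted(posts, key=score, reverse=True)


posts = [
    {"author": "alice", "timestamp": 1000.0, "text": "older post"},
    {"author": "bob", "timestamp": 4600.0, "text": "newer post"},
]
# alice: affinity 5, 2 hours old -> score 3; bob: affinity 0, 1 hour -> -1
feed = rank_feed(posts, interaction_counts={"alice": 5}, now=8200.0)
assert feed[0]["author"] == "alice"
```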


The selling point of Facebook is not the newsfeed. The selling point is that every other person is already registered on Facebook.


Your comment reminds me of the book "The End of Average". The author's main point is that there is no such thing as the "average [blank]".

[1] https://www.youtube.com/watch?v=4eBmyttcfU4
[2] https://www.youtube.com/watch?v=_cMXWcME_vQ


Thanks for the links. I'll try to have a look this Sunday.


Genetic algorithms are used in the design of antenna arrays.
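For readers unfamiliar with the technique, here is a minimal genetic algorithm in Python. It evolves bit strings toward all ones ("OneMax", the classic toy objective) rather than an antenna geometry, but the structure (selection, crossover, mutation) is the same; population size and rates are arbitrary choices:

```python
# Minimal genetic algorithm on the OneMax toy problem: evolve a bit
# string toward all ones via tournament selection, one-point crossover,
# and bit-flip mutation.
import random


def genetic_algorithm(length=20, pop_size=30, generations=100, seed=0):
    rng = random.Random(seed)
    fitness = sum  # fitness of a bit list = number of 1-bits
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament of two: keep the fitter individual.
            return max(rng.sample(pop, 2), key=fitness)

        next_pop = []
        while len(next_pop) < pop_size:
            a, b = select(), select()
            cut = rng.randrange(1, length)      # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(length):             # bit-flip mutation
                if rng.random() < 1 / length:
                    child[i] ^= 1
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)


best = genetic_algorithm()
```

In antenna-array work the "genome" would instead encode element positions or excitations, and fitness would come from a simulated radiation pattern.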



Consider the following case.

Let's say we ban development of self-driving cars since the algos can't be explained. Then we will never know the benefit we might have reaped from adopting them. We know humans make mistakes, but with self-driving cars the error rates might in future be very low compared to humans. We as a society would never experience that future because of the silly idealism that all algos must explain themselves in human terms.


> Let's say we ban development of self-driving cars since the algos can't be explained.

That's a false dilemma since it's by no means proven that the algos CAN'T be explained - just that we don't know how to extract the explanation from the model yet. It's entirely reasonable to suspend real-world use until the maths catches up.


But we would never reap the benefits in the meantime. That was my main point. And this becomes crucial if the number of accident deaths avoided is substantial compared to the deaths caused by AI errors.


Amazon will surely buy you once your product matures. I can see that happening.


I used NHibernate on a project at a company. Beyond simple queries, we hit a performance issue: one complex query took about 10 seconds to load its data to the frontend. So we had to rewrite the NHibernate logic as a custom SQL query (NHibernate lets you write custom queries) and then write custom mappers to map the results back into our application. The reason is that NHibernate does its processing object by object, in memory.
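The trade-off generalizes beyond NHibernate. This sketch uses Python and sqlite3 (standing in for the C#/NHibernate stack; table and column names are made up) to contrast the ORM-style approach of materializing every row and aggregating in memory with pushing the aggregation into SQL:

```python
# Contrast: load-everything-and-aggregate-in-memory (what an ORM often
# does object by object) vs. letting the database aggregate via SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)])

# ORM-style: pull every row into the application, aggregate in memory.
totals_in_memory = {}
for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
    totals_in_memory[customer] = totals_in_memory.get(customer, 0.0) + amount

# Custom-SQL style: the database aggregates; only results cross the wire,
# and a small "mapper" (here, dict()) shapes them for the application.
totals_sql = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"))

assert totals_in_memory == totals_sql == {"acme": 15.0, "globex": 7.5}
```

With three rows both paths are instant; with millions of rows, the in-memory path pays to transfer and materialize every row, which is where the 10-second queries come from.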


I cannot agree with this more. The universe existed without intelligent observers for billions of years. It stands to reason that the universe is constantly being shaped as particles interact.

