Hacker News | hirvi74's comments

> choosing to use AI isn't all or nothing.

That's how I have been using AI the entire time. I do not use Claude Code or Codex. I just use AI to ask questions instead of parsing the increasingly poor Google search results.

I just use the chat options in the web applications with manual copy/pasting back and forth if/when necessary. It's been wonderful because I feel quite productive, and I do not really have much of an AI dependency. I am still doing all of my work, but I can get a quicker answer to simple questions than parsing through a handful of outdated blogs and StackOverflow answers.

If I have learned one thing about programming computers in my career, it is that not all documentation (even official documentation) was created equally.


Same! I don't mind copy/pasting a code snippet or asking a question, and I also always ask it to show its sources for anything non-obvious. That alone cuts down on a lot of bullshit.

> nullable reference types.

What would you suggest instead? I quite like nullable reference types, but I know many find them annoying. My brain is often a scurry of squirrels, so I grew thankful for the nullable refs over time.


I don't mind NRT, but I hate dealing with C# projects that haven't set `<Nullable>enable</Nullable>` in their csproj. It's not perfect, because at runtime a value can still be null, but it's nice when the compiler does most of the checks for you.

The compiler mostly solves this now, but the abstraction is a little leaky.

I heavily use nullable types but I always want them to be declared nullable.
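As a sketch, this is the csproj fragment being discussed (project name hypothetical); with it set, the compiler warns on unguarded dereferences of anything declared `string?` and the like:

```xml
<!-- MyApp.csproj: opt the whole project into nullable reference types -->
<PropertyGroup>
  <Nullable>enable</Nullable>
</PropertyGroup>
```

It can also be enabled per-file with `#nullable enable` at the top of a .cs file, which is handy when migrating a large legacy project incrementally.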


It's true. You will incur the wrath of C#'ers if your simple ToDo list app doesn't have a ToDoListItemRepositoryServiceFactory.cs and a minimum of 4 separate layers in which one must update 20 files because you added a property to one class.

Don't get me wrong, I still love C#/.NET. I use it every day, but my god, has Swift been a breath of fresh air. The Swift community, when not whining about SwiftUI, has been much less dogmatic in my experience.


Blazor is pretty cool if you are into that kind of stuff. Mind you, you still might need the slightest tiny dash of JS, but depending on your needs, you might be able to get away from JS entirely.

It's somewhere in my list of things to try one day. Maybe an AI agent can help make this happen.

> the "default" ORM is far more flexible and powerful

I have never used any other ORM that is as capable. Entity Framework Core with LINQ is what keeps me on .NET.


Because EF's default behavior implements Unit of Work, a LOT of the complex transactional spaghetti ended up disappearing when we switched.

This aspect of EF is highly underrated for complex entity graph mutations.

EF makes the 90% use case easy and the 10% case possible with very little pain. The interceptors, global conventions, and other extension points are an enabler of complex behaviors that are still transparent to most of the team.
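A minimal sketch of the Unit of Work point, assuming a hypothetical `AppDbContext` with `Orders`, `Lines`, and `AuditLog` sets: every mutation tracked by the context is flushed in a single transaction by one `SaveChangesAsync` call, which is where the manual transaction spaghetti disappears.

```csharp
using var db = new AppDbContext();

// Load the entity graph; the context starts tracking it.
var order = await db.Orders
    .Include(o => o.Lines)
    .FirstAsync(o => o.Id == orderId);

order.Status = OrderStatus.Shipped;                       // mutate the root
order.Lines.Add(new OrderLine { Sku = "ABC", Qty = 2 });  // mutate a child
db.AuditLog.Add(new AuditEntry { Message = $"Order {order.Id} shipped" });

// One call, one transaction: EF diffs the tracked graph and emits
// the UPDATE/INSERT statements together. No BeginTransaction/Commit.
await db.SaveChangesAsync();
```

All type and property names above are illustrative, not from the original comment; the pattern (track, mutate, single SaveChanges) is the Unit of Work behavior being praised.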


> Because EF's default behavior implements Unit of Work, a LOT of the complex transactional spaghetti ended up disappearing when we switched.

I have never felt more understood by a fellow HN user. I think I know the exact spaghetti you are talking about, and I agree with you 100%. I wish EF could create (SQL) views, but it's not really any issue considering I can just use raw SQL to accomplish the same thing.
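On the views point: EF migrations do let you drop to raw SQL via `MigrationBuilder.Sql`, so the view's DDL at least gets versioned alongside the schema. A sketch, with a hypothetical view name and columns:

```csharp
public partial class AddOrderSummaryView : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        // EF won't model the view itself, but it will version the DDL.
        migrationBuilder.Sql(@"
            CREATE VIEW OrderSummaries AS
            SELECT o.Id, o.Status, SUM(l.Qty) AS TotalQty
            FROM Orders o JOIN OrderLines l ON l.OrderId = o.Id
            GROUP BY o.Id, o.Status;");
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.Sql("DROP VIEW OrderSummaries;");
    }
}
```

A keyless entity type mapped with `ToView` can then query it through LINQ, though the CREATE itself still has to be raw SQL as above.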

> complex entity graph mutations.

I'm too dumb to know what those words mean together, but I'm going to assume it's something to do with complex entities. If so, I completely agree as well. I sometimes harness the powers of the Dark Magics where my domain entities and database entities are the same objects (I'm not one of those DDD people either). Thanks to EF, I've been able to create some complex objects that really have cut down on a lot of the useless objects I used to litter applications with.


Discriminated Unions is one thing I would like.
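Until the language gets them, the usual approximation is a sealed record hierarchy plus pattern matching; the types here are purely illustrative:

```csharp
// Closed-ish hierarchy standing in for a discriminated union.
public abstract record Shape;
public sealed record Circle(double Radius) : Shape;
public sealed record Rect(double W, double H) : Shape;

public static double Area(Shape s) => s switch
{
    Circle c => Math.PI * c.Radius * c.Radius,
    Rect r   => r.W * r.H,
    // Unlike a real DU, the compiler can't prove exhaustiveness,
    // so a fallback arm is still needed.
    _        => throw new ArgumentOutOfRangeException(nameof(s))
};
```

The missing exhaustiveness check is exactly what proper discriminated unions would fix.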

Just my hypothesis, but I wonder if larger sample sizes provide a more diverse population.

A study with 1000 individuals is likely a poor representation of a species of 8.2 billion. I understand that studies try their best to use a diverse population, but I often question how successful many studies are at this endeavor.


> use a diverse population

If that's the case, we should question whether different homogeneous population groups respond differently to the substance under test. After all, we don't want to know the "average temperature of patients in a hospital", do we?


> If that's the case, we should question whether different homogeneous population groups respond differently to the substance under test.

In terms of psychological treatments, I am honestly in support of this. Many mental illnesses can have a cultural component to them.

> After all, we don't want to know the "average temperature of patients in a hospital", do we?

No, I don't think we do. Am I understanding you correctly?


I started back on Vitamin-D and Omega-3 (with extra EPA & DHA) about 2 months ago or so. Sadly, I don't feel much of a difference, but I believe the potential harms are quite minimal, so I am going to maintain supplementation for the purposes of self-experimentation.

It took me more than 3 months on about 8000 IU daily to get my blood level to the lower edge of the desired range. There was no instant improvement; it takes time.

Do keep your levels in check though.


"A man cannot step into the same river twice, for it is not the same river, and he is not same man."

- Heraclitus


> assignment to rewarding work, get paid top dollar, not be bored, get recognition for success, coaching on career growth, given leeway to make mistakes, not overlooked for promotion, etc.

How likely is one to find all of the above in a job? My current job is essentially the opposite of all of those items. Though, believe it or not, it's not a bad place to work. Just very old school and non-tech focused.

