I see this so often. It's how terrible software gets written: people are afraid to change direction or learn anything new mid-project.
I rewrite most of my code 2-3 times before I'm done, and I'm still 5x faster than anyone else, with significantly higher quality and maintainability as well. People spend twice as long writing the ugliest, hackiest code as it would have taken them to just learn to do it right.
Oops, I misread the word. I read it as “peeping”, as in they were caught trying to see children as they dress or undress. But yeah, reading the word correctly now, what I wrote doesn’t make much sense.
The main issue with PowerShell is its name. It would have been better named PowerScript. It’s not really a shell but a much more powerful scripting language, along the lines of Perl or PHP.
You can hash them without a salt and store them in a set of passwords not associated with user accounts, enforcing uniqueness without having to actually know the passwords.
That still introduces a fairly serious vulnerability. The lack of salting on the "password uniqueness" database makes it a juicy target; an attacker with access to the database can attack those passwords, then try the ones which are known to be valid from there against the salted passwords in the user database.
I wonder if there’s some way to mitigate this, either by keeping the uniqueness database only long enough to identify duplicates and then deleting it, or by using it on lower-priority systems whose passwords people may reuse on your higher-security one. In either case, the small number of bad passwords you’d identify that you couldn’t come up with yourself or find on common password lists probably makes this a bad tradeoff.
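A minimal sketch of the scheme (and its weakness) being discussed, assuming SHA-256 and an in-memory set purely for illustration; a real system would use a slow KDF like bcrypt or argon2, not a bare hash:

```python
import hashlib
import os

def salted_hash(password: str, salt: bytes) -> str:
    """Per-user storage: salted hash, as the user table should hold."""
    return hashlib.sha256(salt + password.encode()).hexdigest()

# Global uniqueness set: UNSALTED hashes, not linked to any account.
uniqueness_set = set()

def register(password: str):
    bare = hashlib.sha256(password.encode()).hexdigest()
    if bare in uniqueness_set:
        raise ValueError("password already in use")
    uniqueness_set.add(bare)
    salt = os.urandom(16)
    return salt, salted_hash(password, salt)

# The vulnerability described above: because the set is unsalted, one
# precomputed dictionary attacks every entry at once, and each recovered
# password is known to be valid somewhere in the user database.
```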
There is/was a school of thought that each user should have their own database account, and the application should connect to the database as that user. The advantage is that you can use the database's built-in user, role, and privilege management instead of having to invent your own. I have admittedly not seen this done much, but there is a certain appeal to it.
>I feel the major issue with excel (and other stuff such as CSS) is that one learns by cobbling things together and never through a formal process.
This is literally all programming, and it's not necessarily a bad thing, as long as you keep learning and don't just keep doing the same cobbled-together mess for years and years.
To Ruby’s credit, they foresaw what others bolted on years if not decades after.
Remember, Python didn’t start out fully object-oriented (not everything was an object), and it bolted on the useful functional stuff later (like every language has now, after a weird period of people crapping on FP for some reason). To top it off, Bundler and the Ruby version manager were largely copied over to Python as pip and venv. I like both languages, and I say this after getting schooled a few times about great things I thought Python did that, rather embarrassingly, turned out to be Ruby ideas picked up by others.
I’ll give you that the language looks a bit funny, and I’m not saying it’s better or anything; I work in Ruby and am all too aware of the warts. I'm just trying to share what I’ve learned about what they did well, because it’s a good and thoughtful community.
> Remember Python didn’t start with being fully object oriented, everything is not exactly an object, they bolted on the useful functional stuff later (like every language has now after a weird period of people crapping on FP for some reason)
None of that is actually true, to the extent that it makes any sense (which it doesn't, really). Python had a complete object system and first-class functions pretty much from the first preview, and anonymous functions (lambda), map, filter, and reduce were added before 1.0 was cut, which predates the first public preview of Ruby.
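For reference, those functional builtins look like this in present-day Python (the one change since the early days I'd note is that reduce was moved into functools in Python 3):

```python
from functools import reduce  # a builtin in Python 1.x/2.x, moved here in 3.x

nums = [1, 2, 3, 4, 5]

# lambda + map/filter: anonymous first-class functions
squares = list(map(lambda n: n * n, nums))        # [1, 4, 9, 16, 25]
evens = list(filter(lambda n: n % 2 == 0, nums))  # [2, 4]

# reduce folds the list down to a single value
total = reduce(lambda a, b: a + b, nums)          # 15
```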
> to top it off bundler, and Ruby version manager again were just largely copied over to Python as pip and venv.
And that is complete nonsense: virtualenvs have nothing to do with rvm, and pip is in no way a copy of Bundler (not that pip is in any way exceptional; it's a package manager).
> I say this after getting schooled a few times about great things I thought Python did that I was a few times rather embarrassingly shown to have just been Ruby ideas picked up by others.
So after getting told you were spouting nonsense one way you went on to spout nonsense the other way?
I believe you that pip has no relation to bundler and venvs have no relation to rvm or rbenv because bundler and rbenv are so much more intuitive and useful.
Poetry gets very close, though, but it took decades to get there, and afaik it is still not viewed as an "official" part of the toolset.
Venv tries to do too much and manages to not be great at anything, which is a direct contributing factor to Python being worst in class by a wide margin when it comes to packaging and dev environments.
I have the exact opposite experience, bundler + rbenv are so much nicer to use than pip and venv. rbenv automatically loads the correct ruby version when I enter a project directory and bundler is just so much more capable and intuitive than pip.
I never really liked Python much personally, I think it sits at an uncomfortable spot where it's still too verbose compared to other dynamic languages but not performant enough compared to static languages.
If I cared more about the expressiveness and the speed of prototyping at the expense of everything else, I would rather use Ruby which has much more stuff built-in and I'd accept to pay the price of its runtime penalty.
But if I cared more about performance, I'll rather use a static language like Go or Rust.
If I have to pay the dynamic language performance & maintenance tax, it needs to be worth it and I need to get some big advantages in return, otherwise I'd just use Go.
The few places I need more performance, I just use FFI and contain the tiny bits needed outside of Ruby. I wrote my MSc on using statistical approaches to improve OCR error rates, and 99.9% or so of the code was Ruby. I only needed to translate a few dozen lines of code to C, e.g. a kNN implementation, after I'd proven it worked.
In 18 years of Ruby use, this has been typical. E.g. I've done large-scale (tens of thousands of layers) map tile rendering in Ruby, and only a tiny core of the final rasterisation code was worth rewriting, even back then, when Ruby was far slower.
The opinion is blown out of proportion, but there is a hint of truth. Large Python projects are indeed easier to follow, simply because of type annotations and signatures. Of course, a lot of projects don't follow type conventions and instead invent their own optional parameter syntax using dictionaries and keyword expansion, so the problem still exists everywhere.
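A hypothetical side-by-side of the two styles being contrasted (the function and parameter names here are made up for illustration):

```python
from typing import Optional

# The "invented optional parameter syntax" style: the signature says nothing,
# and the caller has to read the body to learn the accepted keys and defaults.
def connect_untyped(**opts):
    host = opts.get("host", "localhost")
    port = opts.get("port", 5432)
    timeout = opts.get("timeout")
    return (host, port, timeout)

# With annotations and explicit keywords, the signature is the documentation,
# and tools (mypy, IDEs) can check calls against it.
def connect_typed(host: str = "localhost", port: int = 5432,
                  timeout: Optional[float] = None) -> tuple:
    return (host, port, timeout)
```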
Ruby has RBS and sorbet to support using type annotations. They are each relatively new and aren’t as clean and well integrated as Python’s implementation, but it’s not as if type annotations in Ruby don’t exist.
They do exist, but they are not embraced by the majority of the Ruby community.
Unless type annotations are treated as first-class citizens in the language, it won't be good enough. My theory is that those in the community who wanted static types went to Go or Rust.
Of course anything can be done with RBS, but I think it came way too late. Python's type system has already been given time to evolve and survive in the wild.
A second point is IDE support. It is so hard to get started with Ruby auto-formatting, code completion, ctrl-click code navigation, and debugging. Python is readily usable in PyCharm Community Edition.
And it also produces so much waste, since almost no one writing Python thinks at all about performance, leading to some of the dumbest-looking architectures I have seen just to scale up to even a small number of requests.
I wouldn’t be so harsh as to call it gross, but I also much prefer Python, because Ruby reminds me of Perl. It feels clever, but not in a way that I expect to shorten its BNF. It still bugs me a bit that Ruby has Pascal's ‘end’ while Python uses whitespace, but it worked out in the real world.
Given just the syntax, I would always recommend Python as a first language to scientists in a lab, rather than Ruby. The code just reads and writes itself better, without special characters.
But I think it’s not fair to call Ruby gross, given that some people love C++, PHP, Bash, JavaScript… I’d take Ruby over many languages, given a choice.
Syntax is such a minor detail, I don't know why people care about it so much (unless it's APL or something similarly exotic).
The much bigger elephant in the room is the semantics. My personal pet peeve is that Ruby, just like Perl or C, doesn't have any sort of file-based isolation. While importing something in Python normally doesn't mess up any namespace except for the stuff you've just imported, Ruby basically leaves this to programmers, and they create monstrosities where one require statement can do way too much magic for my liking. And while there are some libraries and frameworks that are closer to Python in spirit ("explicit is better than implicit"), Rails is something that really throws me off, as most things just magically happen to work with some incantation that seemingly comes out of thin air.
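The Python behaviour being described, as a minimal stdlib-only sketch:

```python
# Importing a module binds exactly one name in the current namespace;
# nothing inside the module leaks out unless you ask for it.
import json

assert "json" in dir()          # the module name itself is bound
assert "dumps" not in dir()     # json.dumps exists, but only as an attribute

# Pulling a name into the local namespace is an explicit, visible act.
from json import dumps

assert dumps({"a": 1}) == '{"a": 1}'
```

This is the contrast with a Ruby `require`, which evaluates the file at the top level and can therefore define constants or monkey-patch classes globally as a side effect.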
This new autocomplete thing may be a real breakthrough for people who learn by getting their hands dirty and trying to write something, probing around the available methods and functions to find the appropriate one. If it can suggest what's possible/available, a lot of the magic may fade away and become proper, explainable hard science.
What "special characters" are you talking about in Ruby? My impression is that Ruby and Python are roughly equivalent in terms of non-alphanumeric characters used in syntax.
Some symbols I can think of that Ruby uses that Python doesn’t:
$ for global vars
@ for instance vars
:: for namespace stuff
=> for map key, value separator
{||} for blocks
.. and … for ranges
%w for special array construction
: for symbols
? and ! in method names
?: for ternary expressions
#{} for string interpolation
And Python symbols that Ruby doesn’t have:
: (and ::) for slices
@ for decorators
And for the symbols that they both share, subjectively, a lot of them are used far more often in Ruby.
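For the curious, the two Python-only items from that list in action (a small illustrative sketch):

```python
# :: in a slice is really start:stop:step with the first two pieces omitted.
letters = ["a", "b", "c", "d", "e"]
assert letters[::2] == ["a", "c", "e"]              # every other element
assert letters[::-1] == ["e", "d", "c", "b", "a"]   # reversed

# @ applies a decorator: a function that takes a function and returns
# a replacement for it.
def shout(fn):
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

assert greet("world") == "HELLO, WORLD"
```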
Ruby does have the block related stuff like the & argument and the single line block { }. But other than that I also think it's relatively similar to Python (which doesn't even support blocks anyway).
From my perspective (DevOps/SRE), Ruby is a horrible platform. It is heavy on resources, and it is difficult to run (Unicorn is a pain), maintain, monitor, and debug. Many Ruby projects have silently failed (Chef? Puppet?), and the biggest Ruby tool in the DevOps world, GitLab, is incredibly difficult to run on premise and struggles with a ton of issues that I believe are caused by the platform.
If I had a choice, I would never work on a Ruby project.
As both a Ruby programmer and a DevOps/infrastructure engineer in the past, I have found Ruby to be a superlative programming environment, but just as easy to program badly in as any other. I have never found it harder to run, monitor, or debug than any other platform, and certainly not harder than building and tuning a Java server platform. Really, I think Ruby has been a bigger influence on brilliant tools and practices in modern development than anything else, even if it doesn’t come close to the performance of a static language.
You're spot on. You can see its influence in a lot of tooling that came after, such as several package managers (from yarn to cargo), Java collection syntax, Go structural typing, Python's gunicorn, and the JVM's invokedynamic (introduced originally for JRuby), among others I can't remember off the top of my head. Several new languages were created by, or benefited from the collaboration of, ex-Rubyists, from Elixir to Rust to Node, which greatly influenced their approach to developer ergonomics. Even if the world were to stop using Ruby forever (which will never happen, why would it...), its influence would last for a long time, after which it'd be rediscovered again after the mandatory forgetfulness cycle.
Some great examples there, and look at how many frameworks are “$LANG on Rails”!
Rubyists introduced me to automated provisioning and deployment - Puppet, Chef, Capistrano - as well as concepts like test-driven development and the genius of metaprogramming, but it’s common to hear javascripters waxing lyrical about TDD while slagging off Ruby.
Ruby is like 12-Bar Blues: people who love rock music don’t always like hearing blues. My favourite story is about seeing Earl Slick and Bernard Fowler performing Bowie, and they struck up a long bluesy intro which caused one of the two older fellas standing next to me at the bar to turn to me and say, “I really don’t like all this blues crap!” only for the “blues crap” to become The Jean Genie two seconds later.
Careful what threads you try and unravel as they may weave your own narrative…
From my DevOps perspective, I've built large hybrid cloud deployments in Ruby (from scratch, including an orchestrator written in Ruby) and would again given the right requirements. As for tooling, the ease of writing Ruby outweighs anything else, and running it is no different or more complicated than any other container workload. It's a bizarre objection.
Whether you write unreadable code or not is not a function of language. This is a lazy attempt at criticism. My preference for Ruby is in part because reading and understanding well written Ruby is a joy compared to every other of the dozens of languages I've used.
And since (un)readable code is not a function of language, you cannot state in your next sentence that reading well written Ruby is a joy in comparison with other languages.
Of course I can. There is no contradiction there. You can write readable code in any language, even assembly, but that does not mean well written code in a language that is also particularly readable won't be more of a joy to read than well written code in a verbose or hard to read language.
> The first statement is provably false, empirically, as huge systems are overwhelmingly not written in dynamically typed languages.
Your conclusion is not supported by the claim you try to support it with.
> Ruby is not one of these languages.
You're free to think so, but you've not provided anything but unsupported conjecture and logically invalid reasoning to support your belief, so rather than convince me, you've provided an additional reason to question your judgement.
What do you mean by huge systems? Many of the largest web platforms were indeed built with dynamically typed languages. And these days JavaScript somehow ends up being used for almost everything you can think of except an OS kernel.
Chef's problems "with Ruby" were largely design problems. The whole structure of the run collection and the "two pass parsing" design meant that users had to almost immediately, fully understand how the Ruby parser saw Ruby code. That wasn't really Ruby's fault, and there could have been other ways to structure recipes and avoid smacking new users with Ruby syntax quite so hard.
I have no idea what you are trying to say, but my best guess is that you are referring to a case where one Django application is coordinating with 24 other services in some fashion?
Per fallingknife's definition, that's still a monolith, no different than a Django application coordinating with MySQL. If you think 25 > 1 is somehow significant, throw in Stripe, GPT, etc. in addition to MySQL. He would still call that a monolith. It's all the same.
That would not be a monolith by my definition as I say that coordination removes software from being a monolith, but we've long moved past my definition. We're only here now to understand where fallingknife's definition allows for there to be anything other than monoliths.