> Documentation. There's a common misconception that type systems make code verbose. First, type inference is pretty doable and addresses the problems that give rise to these beliefs. Java and C++ aren't verbose because they have type systems; they're verbose (partially) because they have bad type systems. Second, in my experience, clearly documenting an interface in a language that doesn't have types ends up being much more verbose than it would otherwise be. You need docstrings regardless, but if the type of the thing says it's an integer, you don't have to write "x is an integer" in the docstring too. And now you have a concise, agreed-upon notation for that sort of fact.

Preach it. Also, you don't have to worry that a method whose signature is `public int DoWork(string input)` actually expects char[] as its input and returns a long, the way you do when the types are only documented through comments.
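
To make that concrete, here's a minimal Python sketch (the function is hypothetical, just mirroring the idea): the typed signature carries the contract, so the docstring doesn't have to restate it.

```python
# Untyped: the contract lives only in the docstring, and nothing
# stops it from drifting out of sync with the code.
def do_work_untyped(data):
    """Process the input.

    data: a string (or was it a list of chars?)
    returns: an int (or a long somewhere else?)
    """
    return len(data)

# Typed: the signature is the contract, checkable by tools.
def do_work(data: str) -> int:
    """Process the input."""
    return len(data)
```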



Except Python is not only used for what you do. People use it for batch testing, SIG embedded language, scientific analysis, sys admin, 3D tool scripting, glue code, product pipelines, etc.

Having a required type system for this huge part of the community would be a great letdown.

Those criticisms come from people coming from a strong dev background. They completely ignore the rest of the world.

That's why having the current OPTIONAL type system is so good: you can use it and benefit from it, as many do, but you don't screw over half of your community.

Python's strength is versatility. It's not the best at anything, but it's very good at most things.

That's why if you know it, you can solve most problems decently. That's what makes it fantastic.

I'm still waiting for people doing data analysis in Go, sysadmin in JS, 3D scripting in Haskell, batch testing in Erlang, or geography in Lisp.

Because Python... well, it can. And AI. And Web. And embedded systems. And IoT. And... And...


OK, sure, but people are pitching Python not just as a language to write scripts but also one to do development of large projects on large teams, and I think this property makes it somewhat ill suited for that.


I've written plenty of big projects in Python. It works very well; the problems and solutions you have are just different than in the other languages you're used to.

The question is more "what kind of problems do you want to have, and what price are you willing to pay to solve them?". After that, choose the tech that matches that answer.

For me, the iteration speed, the ease of adding new team members, and the flexibility are key.

But what I discovered is that most people just write bad Python and say they have problems, just like most people write bad C/Java/whatever and blame the language.

You are supposed to write Python in a certain way, like any language, if you want it to scale.


It is of course possible to build a great large product in Python, just like it is possible to do so in whatever your least favorite language is. But I'd argue it takes more discipline than in languages with more type checking, where there are whole classes of errors you simply don't have to worry about.


I have not really found this to be the case. I work daily on a codebase of a few hundred thousand lines of Python code with a hundred dependencies and type errors are usually solved quickly.

For small, local needs to pass around data, a plain dict or tuple usually suffices. If you need stricter contracts, you define good classes and interfaces.
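
As a rough sketch of that progression (the Point class here is hypothetical):

```python
from dataclasses import dataclass

# Loose contract: fine for small, local plumbing.
point = {"x": 1.0, "y": 2.0}

# Stricter contract: declared fields, checkable by type checkers.
@dataclass
class Point:
    x: float
    y: float

p = Point(x=1.0, y=2.0)
# Point(x=1.0) fails immediately with a TypeError for the missing
# field, whereas a missing dict key only surfaces when some distant
# code finally reads it.
```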

There's a need for discipline in your coding style, but frankly I've not found that to be an issue with moderately competent coders who understand it. Yes, you can shoot yourself in the foot, but if anything those bugs are usually obvious and solved in the first pass of testing.


> I work daily on a codebase of a few hundred thousand lines of Python code with a hundred dependencies and type errors are usually solved quickly.

Yeah, but they're often solved after the code goes into production and doesn't work, rather than before the code is ever committed.


"Language X allows you to be more undisciplined" doesn't sound like a great selling point for those languages.


Would you perhaps mind giving your opinion of Ruby?


I had the choice between Ruby and Python years ago. I chose Python because of the community, not the language. They are more or less equivalent, although I prefer forced indentation.

Nowadays, Ruby is dying in its RoR and Japan niches, so the question is moot.


> batch testing, SIG embedded language, scientific analysis, sys admin, 3D tool scripting, glue code, product pipelines, etc.

> Having a required type system for this huge part of the community would be a great letdown.

... Why? These days types generate much more code than they cost in modern FP languages. E.g., https://github.com/haskell-servant/example-servant-minimal/b...

> That's why if you know it, you can solve most problems decently. That's what makes it fantastic.

Most of the things you're talking about are available in other languages. And quite frankly, if pure versatility and available open-source code is your argument, then why aren't you using JavaScript and Node.js?

Name a general domain in which I can't find at least one well-maintained project on NPM. I dare you.

> And embedded systems.

Who... who is doing IoT and embedded systems in Python outside of toymakers?


You may be able to find a well-maintained data science/machine learning project on npm, but I doubt you'll find as many as are needed in a typical data science workflow.


Maybe, not sure how that is relevant. You definitely CAN do it, it's just not as common. Same could be said of F#.


> Name a general domain in which I can't find at least one well-maintained project on NPM. I dare you.

one is named

> Maybe, not sure how that is relevant.

Man, it was your dare!


You didn't do what I said at all. Everything you need is there, it's just not as popular.

Please tell me what pieces you need and I'll do my best to make good on my claim. At a minimum, tensorflow, nltk and spark bindings exist. And in fact, the popular notebook software packages are reaching out to other runtimes already.


That's far from where the bulk of time is spent for most data science workflows. You need a pandas/dplyr and you need a ggplot2/[long list of Python plotting libraries here]. You mentioned F#, which has Deedle for data frames. What's JavaScript/NPM's story on this?



> who is doing IoT and embedded systems in Python outside of toymakers?

Aren't most IoT things basically toys anyway?


Zing.


NPM has way more garbage and moving targets than similar Python packages.


This is hardly a rebuttal.


Being able to pick a time-tested, reliable package that you can expect to still be maintained a few years from now has incredible value. The NPM ecosystem does not provide that.


I can point to a full suite of data science projects in npm with varying lifespans from 1-3 years. Given the null assumption (it will be maintained about as long as it has been maintained so far), this seems to meet your requirement?

That NPM has churn is an example of how massive that ecosystem is compared to PyPI, which is comparatively microscopic and much more dependent on specific corporate actors continuing to invest.


What data science projects on NPM are people actually using? There are literally thousands of people using NumPy/SciPy + Matplotlib. NumPy, BTW, is over 10 years old and counting.

JavaScript is a terrible language for data analysis, given its inferior numeric data types.


> JavaScript is a terrible language for data analysis, given its inferior numeric data types.

I'm still amazed that y'all put up with Python's trashy numeric tower, coming from other contexts.


Automatic bignum promotion is exactly the intuitive behavior one wants for numeric types.
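
For instance (Python 3, where int is arbitrary-precision by default):

```python
# Python ints promote to arbitrary precision automatically:
print(2 ** 100)      # 1267650600228229401496703205376, exact
print(2 ** 100 + 1)  # still exact, no overflow or silent rounding

# Contrast with JavaScript, where every number is an IEEE double
# and 2 ** 53 + 1 === 2 ** 53 evaluates to true.
```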


MicroPython on the ESP32 is pretty awesome. It may not be getting serious use yet, but I'd say it's on the verge of something big.
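
For the curious, a minimal MicroPython blink sketch; pin 2 drives the onboard LED on many ESP32 dev boards, but that's board-specific:

```python
from machine import Pin  # MicroPython's hardware API
import time

led = Pin(2, Pin.OUT)  # GPIO 2: onboard LED on many ESP32 boards
while True:
    led.value(not led.value())  # toggle the LED
    time.sleep(0.5)
```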


How would a type system hinder those people?


Most of those people are not devs. They don't write libraries and don't have time to read many docs or type many lines.

They have thirty 200-line scripts and couldn't care less about code quality. They just want the result.


I am unable to find an answer to my question in what you just said.


Path of least resistance. A strong type system is much more extra work than you think.


That might be true at the start of the project. But to be honest, having worked quite a few years with strong type systems, weak type systems and dynamic type systems, I've found that not having a strong type system is precisely what makes work grow faster than expected. Maybe it's because I'm accustomed to big projects, but I've been bitten too many times. I'd even choose Java over any language that doesn't have a dependable type system.


"having worked quite a few years with strong type systems, weak type systems and dynamic type systems"

Sorry to be pedantic, but you probably mean static type systems, not strong type systems.

See: "What to know before debating type systems" - http://blogs.perl.org/users/ovid/2010/08/what-to-know-before...


I have worked with strong and no type systems, and have found that the absence of a type system is extra work.

Instead of the compiler telling me exactly what something is, what I can do with it, and if it makes sense to pass it to some function, I have to figure all of that out myself, greatly slowing me down.

Edit: It occurs to me I have yet to hear an answer to my question of how types hinder anything, especially if they are inferred. The only exception is the trivial requirement of sometimes having to cast between numeric types.
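
A rough illustration with Python's own optional checker, mypy: the list's type is inferred with no annotation needed, and the bad call is flagged before anything runs.

```python
def double(x: int) -> int:
    return x * 2

words = ["a", "b"]  # mypy infers list[str] on its own
double(words[0])    # mypy: Argument 1 to "double" has incompatible
                    # type "str"; expected "int"
```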


I really like type systems. I think that if you take the time to learn about type theory, then you are more likely to create better solutions (both with code and in life in general).

However, it isn't free. Type theory is a kind of math that most people have very little exposure to, so there's going to be a lot of work in order to start becoming proficient.

Additionally, there's more than one type of type theory. Are you using a system F like system? Are you going to have sub-typing? Is it going to be structural sub-typing? Maybe you want row polymorphism. Is there going to be a kind system? What about higher order poly kinds? Dependent typing? Intensional or extensional?

Additionally, there's more than one kind of implementation of these type systems. OCaml functors... is it a generative or applicative functor? Haskell... are you using GADTs, functional dependencies, or type families?

In the end, I think type systems will eventually be able to give you a completely typed experience that feels exactly like a fully dynamic one, but with compile- and design-time checks that always make you feel good about the experience. However, I don't think we're quite there yet, and I don't think you can expect everyone to take the time to get sufficiently familiar with an arbitrary type system in order to be productive with it.


I would be pleasantly surprised to find an extensional type system in a mainstream language :)


Yeah, you basically make initial implementation easier at the expense of maintenance and debugging, which sounds like a good tradeoff until you think about the relative amount of time you spend doing one thing vs the other.


> at the expense of maintenance and debugging,

Do you realize how insane that sounds? Static typing makes programs harder to debug? Harder to maintain??

On the contrary, static typing helps debugging and maintenance: changes that break invariants are more likely to be caught by the type system.

This talk of tradeoffs sounds wise on the surface, but it's hardly a tradeoff at all. For many people (including me), a good static type system makes prototyping and maintenance easier.


It probably sounds insane because you have interpreted my post to mean the exact opposite of what I intended.


Your comment sounded like this to me:

> Yeah, you [DarkKomunalec] basically [use static typing to] make initial implementation easier at the expense of maintenance and debugging

It was not clear that "Yeah" was an approval (and not a dismissal), and it was not obvious that "you" was a general "you" (and not a personal "you" directed at DarkKomunalec).

Nevertheless, you were still talking about a tradeoff, and I personally see none: in my experience, dynamic typing makes initial implementations harder (or longer), because so many errors are caught too late. Static type systems have a much tighter feedback loop.


Many, many people split their time between the two very differently than you imagine. There's a lot of scripting code that is written once and then never maintained.


If your claim is "Python is great for toy scripts" then fine, I don't have any issue with that.


But you are a dev. These people are not. Code is not their product.


A programmer sees 2 and 2.5 as different types.

A normal person doesn't care. He just wants, and expects, 2+2.5 to yield 4.5. He doesn't want to use a cast, or write the 2 as 2.0, or use some sort of baroque type conversion procedure, or anything like that.

This answer is not Python-specific, of course, but it's a good example of the overhead that gets introduced when a language becomes too type-happy.
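
For what it's worth, Python's numeric tower already does exactly this, casts included, without anyone asking:

```python
from fractions import Fraction

print(2 + 2.5)             # 4.5  -- int promotes to float, no cast
print(Fraction(3, 5) + 2)  # 13/5 -- exact rational arithmetic, same idea
```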


> A programmer sees 2 and 2.5 as different types.

So does anyone who's done math. They also know 3/5ths is different. It's not unreasonable to ask for addition to be defined in a reasonable way though.

> This answer is not Python-specific, of course, but it's a good example of the overhead that gets introduced when a language becomes too type-happy.

Besides OCaml, who actually does this for general programming? I can't think of many examples at all.

P.S., "A normal person doesn't care. He just wants". Stop this. The community here might give you a pass for being tedious and correct. Being tedious and incorrect is pretty much unforgivable.


"So does anyone who's done math."

Hmmm... I could have had an undergrad math degree in addition to the CS degree if I'd stuck around one more semester, but decided to head off for CS grad school instead. And yeah, I understand cardinality, and why there are more real numbers than integers (and could even write the proof for you from memory).

I also completely understand that 2.5 in a computer isn't actually represented as a non-integral real number, or anything like it. The computers we have now can't represent arbitrary real numbers (quantum computers can, I think, but I haven't studied those to any great degree). At one time I even wrote some non-trivial asm programs that ran on the 80x86 FPU, but I'd have to do a fair amount of review before doing that again.

So yeah, I'd say I've both "done some math" and have a good handle on how integers and floats are represented in a computer.

That still doesn't mean I want to have to put in a freakin' type cast when I add 2 and 2.5 on a pocket calculator. Nor does anyone else.


Answer my question. I'm not going to defend a non-existent problem.

Or is this about Pascal again? Did OCaml bite you, and do you still have the mark? I'm trying to give you an opportunity to suggest this isn't a straw man. My most charitable hypothesis is that you really don't know much about modern static and strong typing techniques.

Everyone's numeric tower accounts for this and does sensible (if not optimal) type conversions. The best of the bunch give you fine grained control on what happens when. That something must happen is inescapable.


I'll bet he starts to care if floating point arithmetic introduces errors into his results. You can only push off the complexity for so long if you want to do things that aren't trivial.


Can you think of a (non-contrived) example where automatic promotion to float is going to cause a non-trivial error when computing (say) a household budget?

"You can only push off the complexity for so long if you want to do things that aren't trivial."

There are a lot of things that aren't "trivial" that nonetheless don't require a totalitarian type system.


> Can you think of a (non-contrived) example where automatic promotion to float is going to cause a non-trivial error when computing (say) a household budget?

Having your share of holiday costs come out as NaN is fiddlier than getting an exception at the point where you actually divided by zero.
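
A sketch of the difference (plain Python actually raises on float division by zero, but NaN can still sneak in, e.g. via inf - inf, and then propagates silently):

```python
import math

# NaN propagates through later arithmetic, so the error surfaces
# far from its cause:
bad = float("inf") - float("inf")  # nan, no exception raised
share = (bad + 500.0) / 4          # still nan
print(math.isnan(share))           # True -- but where did it start?

# Integer arithmetic fails at the faulty operation itself:
people = 0
try:
    100 // people
except ZeroDivisionError as e:
    print("failed exactly where the bug is:", e)
```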


Both of those are runtime errors, though.

The "type safe" guys like to pretend that their approach can catch all that stuff at compile time. It does catch a certain class of error, but at the cost of making the code take much longer to write. That doesn't work in a world where your competitor is iterating five times while you're still building the first one. Excellent way to get your milkshake drunk, that.


> Both of those are runtime errors, though.

Sure, the point is that using integer arithmetic for integer calculations gets you better error reporting, which saves you time when tracking down other bugs.

> The "type safe" guys like to pretend that their approach can catch all that stuff at compile time. It does catch a certain class of error, but at the cost of making the code take much longer to write. That doesn't work in a world where your competitor is iterating five times while you're still building the first one.

My experience is that I can iterate a lot faster if the compiler's able to help me catch errors faster. It doesn't slow down writing the code; almost anything I'd write in say Python translates directly into the same Scala. (I guess the language pushes me into defining my data structures a little more explicitly, but that's something I'd want to do in Python anyway if only for documentation reasons).


Define "much longer to write." I don't think that claim is true.


Isn't the biggest non-programmer audience who has any interest in writing Python scripts scientists? I don't think it's totally contrived to imagine that floating-point precision is an issue in such cases.


Fun fact: half of the professional programmers I've met in my life don't even try to do anything about floating point precision issues.

You are living in your bubble. The bubble of people who know what they are doing.

Get out; you'll be surprised how amateurish the world is.

Yet it runs.
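
The canonical example of code that "just runs" while being quietly wrong, plus the fixes the careful minority reach for:

```python
import math
from decimal import Decimal

print(0.1 + 0.2 == 0.3)                 # False
print(0.1 + 0.2)                        # 0.30000000000000004

print(math.isclose(0.1 + 0.2, 0.3))     # True (Python 3.5+)
print(Decimal("0.1") + Decimal("0.2"))  # 0.3, exact
```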


I'm really not a fan of this argument. No one is arguing for a banishment of the concept of a competency PMF. We're just saying, "If you use these newer techniques and tools and patterns, you get more bang for your buck."

The common response is, "But then I have to learn something new." But this is the curse of technology, and pretty much inevitable. Some learning needs to occur because tech changes so fast.


"If you use these newer techniques and tools and patterns, you get more bang for your buck."

But you don't, necessarily. Dealing with type wankery takes time. And no, it has nothing to do with "learning something new". Languages that tout "type safety" have been around since at least Pascal (47 years old)... arguably even before that, with some of the Algol variants.

Yet they've never achieved mainstream acceptance. It's not even about hype -- Pascal was pushed HARD, by just about every university. Where is it now? Turbo Pascal had a small degree of success, but that's only because it chucked a lot of real Pascal's rigidity out the window.


So... Can you clarify this? "Pascal is no longer popular. Pascal had a static type system. Therefore, static typing has failed?"

If so, I counter: the last 3 years have been a series of breakthroughs both in terms of technology and social acceptance of typed programming. TypeScript is the most rapidly growing language, Haskell's never been more boring to use, and Scala's edging out Clojure even though it has very fragmented community leadership. C++ has adopted a lot of powerful new features, and you're seeing functional programmers speaking at C++ conferences because the folks there can use them. Java has more capable and composable lambdas than Python.

Systems not using these techniques are plagued by security flaws, while those that are work on stabilizing performance under various workloads.

It's never been a better time to be using a strong, static type system.


"Can you clarify this? "Pascal is no longer popular. Pascal had a static type system. Therefore, static typing has failed?""

I would make it "Pascal was never popular", but yes.

"If so, I counter: the last 3 years have been a series of breakthroughs both in terms of technology and social acceptance of typed programming."

This isn't the first rodeo for many of us. "Compile-time static type checking will solve all of our problems" is an idea that's come around repeatedly. Outside of a few niche applications, it never works, or even catches on.

As for the supposed booming popularity of TypeScript... dude, TypeScript doesn't even make the top 30 on GitHub. It's less popular than assembly language and Visual Basic.


> I would make it "Pascal was never popular", but yes.

Then why... why bring it up? Should I discount all of dynamic typing because Io and Pike never took off? C++ did stick around, Oak became Java. APL is still in active use.

> This isn't the first rodeo for many of us. "Compile-time static type checking will solve all of our problems" is an idea that's come around repeatedly. Outside of a few niche applications, it never works, or even catches on.

"It never works" is a pretty bold claim given that the majority of code you interact with on a daily basis has SOME sort of type system. I'd say C++ is better endowed than most in this dimension.

> As for the supposed booming popularity of TypeScript... dude, TypeScript doesn't even make the top 30 on GitHub. It's less popular than assembly language and Visual Basic.

My dude, it would be extremely suspicious if it did. Instead, look at the growth stats: https://octoverse.github.com/. It's the fastest-growing language with a non-trivial presence on GitHub (and of course, that's the correct way to phrase it; a newly appearing language can show quadruple-digit percentage growth).

This seems profoundly disingenuous. Is that your intent?


Have you ever heard of an obscure language called Java?


Yeah, I have. Java is notorious for tossing 30 page runtime exceptions all over the place. Given that the alleged advantage of getting type happy is to prevent runtime errors, can you explain how Java actually supports your case?


Are you mad at the existence of stack traces? Would you prefer it if the errors were unsourced? Are we pretending Python does it better or even that differently?

As for "the case", Java does reduce the # of NPEs you get by not letting you deref invalid keys on objects, and it makes it easier to handle objects in the abstract.


Well, for one, the post I was responding to claimed that no language touting type safety had ever caught on, and yet there is Java, likely the most used programming language on Earth next to C and C++ (which themselves offer some level of type safety).

But moving on to your claim in this post: nobody ever said "compile-time checks eliminate errors altogether." What they do is reduce errors and completely eliminate certain classes of them. They also make maintenance and debugging much easier, because they set clear expectations for a function's arguments and return values. The length of stack traces is a completely orthogonal concern.


Yes, everything looks like it works, but occasionally it's completely wrong. I don't think just chugging along is desirable in all circumstances. But even if I accept your premise, that just says to me that Python is a good choice for people who don't really know how to program and don't care to learn too much.


Floating point precision isn't even a problem for most scientists, at least not the ones dealing with real-world data.

It's pretty rare to have measurements that are accurate to more than a few decimal places.


Yes, we had non-trivial floating point errors in Level that appeared after multiple divide operations in an early version of the product. We stopped using it.


I think that would depend on the type system. Dart has an optional type system (no types, some types, strong mode) with different levels of safety. Interestingly though, even if you write strong-mode code, the types are completely thrown out before execution. It's a bit of an aside, but I don't know (libs aside) why anyone would choose Python over Dart.

Types are great for large projects but tend to add verbosity and development time for small scripts (which is why there are so few strongly typed scripting languages). SML/OCaml show that there is a nice middle ground where most types are inferred, so you can keep your types without too much work. Unfortunately, they've been around for decades with little professional usage.


Hey, Lisp is pretty good for geography and was the language of choice for writing quick extensions in AutoCAD back in the day. Lisp lists are as natural for representing 2D and 3D points as Python lists and dicts.

My first programming job was writing map reports (choropleth maps) using AutoLisp.


The geographers I know would never, ever be able to do in Lisp what they do in Python. They are not computer-minded at all.


This meme about "computer-mindedness" is pseudo-scientific nonsense. It's something I fell hard for as well, because it's a lovely idea. But it's simply not true.


Lisp is not for the enlightened only! Anyone who has ever used a scientific calculator can hack around.

In fact, Lisp can be much simpler than Python; it's just like using an RPN calculator if you don't want to venture into macros and other advanced stuff.


I am thinking of a function in the Windows 3.0 API that was declared in the header file as returning BOOL, but actually could return TRUE, FALSE, or -1 (which, of course, is true according to C's rules).

Worse, I didn't have the docs -- just the header file. That one cost me a fair amount of head scratching before I figured it out.


Python 3.5+ has type hints now (and a standalone typechecker). It doesn't do runtime checking by default, but you can use a third-party library to enforce these things if you're concerned about type safety (at the boundaries of your programs, for checking inputs, for example).

More info on this in the pep: https://www.python.org/dev/peps/pep-0484/
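
A small sketch of both halves, assuming the third-party typeguard package for the runtime side:

```python
from typeguard import typechecked  # third-party; enforces hints at runtime

@typechecked
def greet(name: str, times: int) -> str:
    return ", ".join([f"hello {name}"] * times)

greet("world", 2)    # fine; a static checker like mypy also accepts this
greet("world", "2")  # flagged statically by mypy, and raises at runtime too
```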


Read again, I got it covered.



