
Ohio, Kentucky's neighbor to the north, also has a rich archaeological record. The Ohio Archaeological Society's journal is archived here: https://kb.osu.edu/handle/1811/55832 Fascinating read for anyone interested in this topic.


You might be interested in StarLogo. [1] It's a Logo dialect designed for simulating decentralized systems and emergent behavior.

1. https://en.wikipedia.org/wiki/StarLogo


How do you find processes that can be automated? I've often thought that there must be a ton of this stuff in various industries where programmers aren't typically embedded.


Very common when you do tech work in a non-tech industry. Comes up a lot with coding-inclined mechanical engineers. Zed Shaw's words are really true:

> Programming as a profession is only moderately interesting. It can be a good job, but you could make about the same money and be happier running a fast food joint. You're much better off using code as your secret weapon in another profession. People who can code in the world of technology companies are a dime a dozen and get no respect. People who can code in biology, medicine, government, sociology, physics, history, and mathematics are respected and can do amazing things to advance those disciplines.

https://learnpythonthehardway.org/python3/advice.html


Man this is me. Knowing a bit of coding and machine learning in engineering has been such a boon over my career in civil/environmental engineering.

But it's like math, if you know it well enough, you'll find ways to use it everywhere. If you don't, you won't. You have to be the type of person that likes to innovate. It's hard to sell to prospective employers, but it's great for demonstrating value once you are with an organization. All of my previous employers fight over trying to get me back when I've found myself looking for work. ...Now I work for myself and make my own work and I've priced myself out of their offers, but that's not so bad.


How much machine learning have you been able to pick up and did you learn formally (in school) or just on your own? It's a broad subject, so where would you recommend one begin, assuming I have a decent undergrad math background? Thanks!


Not the parent, but I did an undergrad maths/physics degree some time back and found https://www.coursera.org/learn/machine-learning to be good as an introduction. Unfortunately a new job [unrelated] has kept me from finishing the course, but I hope to pick it up again later in the winter.

I'd be interested in thoughts from anyone with ML experience who has reviewed the course's materials.


I've always had way too much math under my belt, which helps a lot, and I have taught myself a lot of genuine computer science out of personal interest. I actually did Andrew Ng's Coursera machine learning class all the way through as a first introduction to that field, before realizing it wasn't so mysterious and was just the application of a lot of math I already knew. Then I ran through a bunch of TensorFlow tutorials when it first came out, and the like. Then just experimented on my own. I have a knack for data though.

Formally from school, I've only had 3 semesters of scientific programming in Fortran and a shitload of math. That and years and years of building models and massaging data in Excel.

Mostly I'm just really used to learning a new API/tool and applying it to new things.

A lot of the ML stuff hasn't been fancy ML, just basic things but applied in really clever and novel ways.


Study stats and convex optimization. If you understand logistic regression and MLE, you're mostly there.
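
For the unfamiliar (my gloss, not a quote): logistic regression models p(y = 1 | x) = 1 / (1 + exp(-w·x)), and MLE just picks the w that minimizes the negative log-likelihood, sum_i log(1 + exp(-y_i · w·x_i)) with labels y_i in {-1, +1}. If that sentence reads naturally to you, a surprising amount of ML is within reach.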


Great quote -- surprised I've never seen it before!

In my limited experience, it's a mixed bag.

Good: you get special treatment/opportunities because of a unique skill set and increased visibility on the end results of what you do.

Bad: management doesn't really know what you do between software releases, you're paid the going rate for your industry while SWEs make far more, and in-house software quality standards might not be established/followed.


Side-ish bonus: you're basically prepared to run a one-man show or move into niche consulting (much, much higher rates). You also get to set the quality standards/procedures going forward, which can be satisfying.

As a tangential bonus, I've accidentally converted my PhD research coworkers to strict git/markdown, thanks mostly to Typora (a Windows application). I showed one person what my workflow and version history look like for some internal documentation, and now they do the same and convert to Word/PDF as a last step. As far as I can tell, no one outside of the math/CS intersection has the patience for LaTeX.

Source: in that boat.

Edit: Regarding market rates, that can be alleviated somewhat in follow-up negotiations (6-12 months in or so). It's hard to convince someone of what you're worth / what your value proposition is when they're not used to hiring software people. You need to demonstrate your business effect first, since they typically don't have a clear picture of it.


I disagree with your edit. Pay is categorically higher for engineers in tech than outside it.


Sounds like a pain in the ass.

It's already a huge pain dealing with know-nothings inside of tech firms. Now imagine having to cater to know-less-than-nothings somewhere else.


strongly disagree

1. you _won't_ be happier running a fast food joint - the work is gruelling, and it's so easy to go bust. And you won't bring home six figures

2a. people who _actually_ can code are scarce, even in the tech industry. Source: conducted over 300 interviews

2b. quite logically, your coding skill will be most appreciated and compensated at a Big Tech Co, not at a government department where they will be simply unable to see the difference


>2a. people who _actually_ can code are scarce, even in the tech industry. Source: conducted over 300 interviews

As a young person with an interest in software programming (currently studying chemical engineering but still writing C code now and then), what do you look for when trying to identify people who can _actually_ code?


Literally the ability to write working code. You'd be amazed how few interviewees can even put together a working for... loop.


This absolutely blows my mind. I have a hard time even believing it. But having never conducted interviews, I just don't know.


I've heard this story before, but it sounds absolutely insane and I can't begin to imagine (let alone expect!) that such a thing occurs. Is it really true?


yes, exactly this


a somewhat serious answer: simple things should be easy to you, and hard things possible

examples of simple things: DFS/BFS walks; simple Project Euler problems; or "write a simple game in terminal, maybe with some form of minimax search" (a bit harder), or maybe a parser for simple arithmetic expressions
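
to give a somewhat concrete bar, here's the level of "simple" I have in mind, a minimal DFS over an adjacency-list graph (Haskell, with representation and names chosen purely for illustration):

    import qualified Data.Map as M
    import qualified Data.Set as S

    -- depth-first traversal order from a start node: the Set tracks
    -- visited nodes, the list is the explicit stack, and neighbors are
    -- pushed in front so we go deep before going wide
    dfs :: Ord a => M.Map a [a] -> a -> [a]
    dfs g start = go S.empty [start]
      where
        go _ [] = []
        go seen (x:xs)
          | x `S.member` seen = go seen xs
          | otherwise = x : go (S.insert x seen) (M.findWithDefault [] x g ++ xs)

nothing clever; an interviewee who can produce something in this neighborhood, in any language, clears the bar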


Same money running a fast food joint? You're underpaid, bro, or you know some crazy overpaid fast food managers. People from science-based fields are coming to computer science because those other fields lack job opportunities at competitive pay.


I believe OP means running a fast food joint as the owner, not as a salaried manager.

Granted, I don't know anything about that business, but a quick Google search turns up an article from 2015 saying running a McDonald's provides an average annual profit of $150k, so it sounds about in the right ballpark: https://www.mymoneyblog.com/mcdonalds-franchise-cost-vs-prof...


McDonald’s franchise owners make $500k to $1 million a year, on average. That’s profit, not revenue.

My source is the McDonald’s franchise disclosure documents. The money blog mentioned in sibling comments claims it’s less.


According to this random quora link: https://www.quora.com/How-much-does-McDonalds-make-in-a-day the average McDonald's unit in the USA has $2,670,320 in annual sales. 25% - 40% profit margin sounds really high for a restaurant, but I don't know enough about the industry to dispute it. Are you sure those numbers were presented as averages and not the high end of what you could make? Or does the typical owner have multiple locations?


My source was this pdf: https://www.bluemaumau.org/sites/default/files/MCD%202013%20...

Very possible I skimmed and may have read it incorrectly, accounting is not my thing. As I have now reached the maximum amount of effort I'm willing to put into a forum comment, I'm not going to dig any further. But if you can tease out better information, I'd be curious to know.


From the document... Across ~12,000 locations they put the majority of restaurants in a range but they do have the numbers pretty well crunched.

Average profit margin: 26-28%

Average gross sales: $2.2M - $2.6M

Average operating income before rent/tax: $570k - $716k
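
(As a sanity check, those figures hang together: 26% of $2.2M is about $570k, and 28% of $2.6M is about $730k, which matches the stated operating-income range.)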

"The rent paid to McDonald’s will vary based upon sales and McDonald’s investment in land, site improvements, and building costs."

It looks like that rent paid to McD's home planet is somewhere in the neighborhood of 10% of that investment (yearly? I guess?) but it seems to average about 100k-150k.


Good, this was my takeaway as well. My range was loosely rounded since 716k was something like the 85th percentile.


I assume running = owning, not managing. Subtle difference maybe in practice, but not in revenue.


I don't buy it.

There are tons of software "engineers" who try to escape fintech for FAANGs or even startups, coming from top tier firms like Goldman and McKinsey.

Likewise, bioinformatics generally pays a lot less than traditional SWE roles.

So what I have seen is the exact opposite in practice: people with minimum coding skills in other fields trying to jump into tech firms.


I suppose it's hard to match the salaries at The Big Corps working in other industries. Still, outside of software companies, people think I'm basically a wizard for doing simple things, like automatically generating climographs from JSON files. I just don't get that kind of positive reinforcement from software people. I think it's just more enjoyable to be the hacker rather than a hacker.


That's great, but I prefer to work with people who can teach me new tech knowledge.


Any midsize company has a tremendous amount of things that could be automated. For example, I work at a manufacturing company; we produce parts for bigger companies. The bigger companies already have APIs, web portals, etc. We have people who often manually enter data.

Any time a piece of paper is passed around or people enter data manually, you can either automate or improve the process. People are prone to errors.

Right now, I am rewriting a mission critical application, some trivial changes will save hours a week and improve the integrity of the information. It is not as exciting as writing algorithms but it is nice to see an application written by you used by 200+ people.


Go on forums where people who use a platform/API/SaaS post implicit feature requests

Off the top of my head:

https://sellercentral.amazon.com/forums/c/selling-on-amazon/... https://community.ebay.com/t5/Tools-Apps/bd-p/tools-apps-db


MWS is not fun. Trust me. Haha.


Honestly your best bet is to talk to people. I doubt cold calling/knocking on the door of a business will succeed, but people get enthusiastic when it comes to complaining about tedious/monotonous work. Especially at bars. Turn to someone nearby and say "I am looking for ideas to yadda yadda save people time and frustration. Is there any tedious process you deal with at work which you think doesn't need a person to deal with?" If you travel, this is especially common in airports. I've never asked someone this question and yet people rant at me all the time once I say "software developer".


You really need to find a domain expert who can at least get you started. I'm working on a problem that I would not have even known existed, if not for running across a couple of engineers who had a business in the field and saw an opening that only a small company would care about (market isn't big enough for the large players).

I know similar situations have to be all around us. The problem, as you say, is finding out about them.


At the moment many corporations are just automating the incoming invoice process. However, many processes are document-based (any kind of digital file) and share information between departments, vendors, or customers. Many of these processes could be automated. To identify a business case worth coding such an application (or offering an API) for, we use three main KPIs to find processes worth automating:

- more than 10 documents per day on a yearly average; e.g. a bank will receive new annual reports only in some calendar months, but at massive scale

- average number of pages or lines of text per document: the longer the document, the more mistakes will be made by humans, as they don't have the time to read everything in detail

- average pay of the FTE who is able to understand and process the document manually should be higher than the average pay of all employees in the company, to make sure the documents encapsulate business value

It's not a fixed set of KPIs, but it helps us sort out use cases that are too narrow.

By "we" I mean the team behind my startup, Konfuzio:

http://www.konfuzio.com/en/


Thanks a lot for sharing such valuable insights!

Small payback: on your English home page under "Operation of the software" > "Information security", there's one sentence too many:

> We set the highest standards both when creating the software and when processing your data. Both when creating the software and when processing your data we set the highest standards.

(First one is better imho).


What's the link?


https://aspietests.org/raads/index.php

Here you go. Good luck, it's 80 questions. (My pet theory is that going through all those questions is already an answer in itself)


I like statically typed functional languages but I'm interested in understanding why people dislike them.


I write Haskell currently, but have used Clojure in the past. I find Haskell to be superior (static typing being one reason) however it takes more persistence and effort in the beginning to get used to Haskell.

I'd recommend that people start with something "easy" like Elm, and then transition to Haskell via frameworks like Miso or Reflex. That's exactly what I did.

This project of mine used to be in Elm & Go, but now uses Haskell throughout: https://github.com/srid/slownews


  Location: Munich, Germany
  Remote: Yes (remote only)
  Willing to relocate: n/a
  Technologies: JavaScript, TypeScript, Elm, Haskell
  Email: maxhallinan <at> gmail.com
I'm looking for small web development contracts (1 - 40 hours). Resume on request.


Where does this community of hobbyist language designers gather online?


/r/programminglanguages (17.5k subscribers) is a good start. They recently launched a directory of projects (50+) the group is working on: https://www.proglangdesign.net which also links to their IRC and discord servers.

Also there's https://futureofcoding.org which has a growing community of language and tool designers focused on live-coding languages. There's a good number of projects being developed by members there.


Thanks!


How does the Aesthetic Usability Effect account for the success of websites like Amazon, Wikipedia, Craigslist, Hacker News, classic Reddit, and LinkedIn, to name a few? There seems to be no correlation between aesthetic quality and use of these sites. I'm not sure what that indicates about the "usability" of these sites. If the goal is to maximize the number of users, it would seem that there is some evidence that a sense of usability determined by aesthetic quality is not that important.


I know this will never happen, but it would be interesting to see the percentage of Reddit users who took the time to opt out of the new "improved" interface. I suspect that number is small, because most people have learned to just quietly sigh and accept random UI changes, but larger than the Reddit UI team would care to admit.


If you are a moderator of a reasonably popular subreddit, you can find out by checking the traffic statistics, which break out old vs. new! For example, for my little link-sharing subreddit (https://www.reddit.com/r/gwern/): https://imgur.com/a/sUcEQeQ

You can see a large fraction of the users have taken the time to opt out, perhaps as much as a third (!) of those who can. Considering how passive most users are, this tells me that the new & expensive Reddit design is indeed loathed.


Reddit tries pretty hard to force you back onto the new design, too. When not using old.reddit.com it "forgets" that I opted out of the redesign on a regular basis, and even old.reddit.com occasionally redirects to the new design.


I think that might be a bug. That was happening to me on a daily or more frequent basis for a while, but I don't think I've had to redo the opt-out in like a week.


Thanks! 1/3 opting out is indeed damning. I'm surprised they made those stats public. Hopefully a moderator of /r/programming, /r/news, /r/politics, or some other mega-popular subreddit will post their stats.


People aren't using those sites because of their UIs, but because of the content they offer. UI isn't the only factor, but it is a necessary factor.


> UI isn't the only factor, but it is a necessary factor.

Necessary for what (or for whom)?


A UI is necessary for users to use your product. That doesn't mean the UI is your product.


> A developer who knows how to code and hack may not have the skill/design knowledge to properly create a language/DSL, perhaps making a monster in the process. A well trained computer scientist (academic, self taught, doesn't matter) should have those tools and more importantly the design know how.

The author of Beautiful Racket, Matthew Butterick, is a lawyer and a typographer. He is not a computer scientist. And yet, he designed a DSL called Pollen for creating web-based books. Pollen has been quite successful within the Racket community.

My understanding is that Racket is predicated on the idea that people like Matthew, as much as people like your computer scientists, are the best authors of DSLs. They are the ones who understand the domains they're working in. If the division you predict ever does exist, it is because the tools have failed to make people who are not computer scientists capable of creating the DSLs they need to solve their problems. It's not because languages are inherently better designed by experts in the domain of programming languages.


Why wouldn't the author qualify as a self-taught computer scientist, though? To me your point is more of a statement on the accessibility of computer science, and I do completely agree that accessibility is important, which the Racket/HtDP ecosystem does pretty well with.

That said, there are many ways to write code and learn to code, and I think of the web programming bootcamp style, or the cookie-cutter college grads who go through four years learning how to program in X language to work at fancy company Y. My point is that many routes are just not focusing on the higher-level design skills that I think are needed to make good libraries/frameworks/DSLs.

To clarify, I'm not saying you have to be a PL expert, simply good at program and language design, which I think is what's lacking in many places and could create such a division. That skill is/should be accessible to everyone.


Ok, I think I read more into your first comment than you actually said.

>many routes are just not focusing on the higher level design skills that I think are needed to make good libraries/frameworks/DSL's.

I have observed that too. But I don't think this is about who is and isn't a computer scientist, whether self-taught or formally trained. I think it's more a change in the way people relate to programming languages. Perhaps programming languages were commonly assumed to be principally an academic topic. Perhaps it's not that more people are becoming computer scientists but that more people are finding non-academic ways to relate to programming language design. I think what Butterick did was to build the tool he needed to do a job (write a book). So language design becomes just like any other form of hacking.

>Why wouldn't the author qualify as a self taught computer scientist though?

Butterick himself is adamant that he is a lay person. He compares himself to a "squirrel in a Ferrari": https://www.youtube.com/watch?v=IMz09jYOgoc And that's his point in that talk - Racket makes it possible for even the lay person to build the language they need.


He can view himself however he wants, but the man just wrote an article that competently covers Turing completeness, regular expressions, and Lindenmayer trees, among other things! He's definitely earned his comp sci merit badge, so to speak.


Computer science is a specific treatment of these topics that is based in formalism. Compare John McCarthy's "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I" and Paul Graham's "The Roots of Lisp". These papers cover exactly the same material. But only the first is computer science because it uses a formal language to express the ideas.


If that's the definition of CS we're using, then my original post is a very egregious misnomer. I feel like that definition is way too restrictive though. CS can be formal or informal, but both are still CS IMO.


I never understood why CS people are so afraid of non-CS people touching language development. They hide the keys as much as possible so that nobody but the predestined can touch it. Racket gives out that key. That said, few people will use it (probably only the most prepared), and if these language hacks are complete nonsense, they'll fail. The world will not suffer much from this process.


If anything "CS people" are less afraid than they should be, given the demonstrated popularity of horrible retrograde languages designed by people who didn't know what they were doing.


I thought this was one of the notably bad talks this year. The whole premise that a function of Maybe a should be a function of a without an API change is neither intuitive to me nor really justified by Hickey. Different things are different. It's sad to see someone build such a wall around himself when faced by something (type theory) that he doesn't understand.


>that a function of Maybe a should be a function of a without an API change is neither intuitive to me nor really justified by Hickey

He spent many minutes motivating it. If you provide some functionality to clients (as a library, or a service, or more abstractly), and then later want to relax the constraints that you require from those clients, this should be an acceptable change that they shouldn't have to worry about / be broken by. Like if all of a sudden some REST API no longer requires a specific HTTP header to be present in requests, then this change shouldn't break clients that have been including that formerly-required header all along.

Similarly, if you provide something to clients, and you add functionality to provide a response in all circumstances, not just some, then this should not break clients either.

This clearly is not true of `Maybe` / `Option[T]` and I think it's a pretty fair critique. Maybe we should be using Union types in these situations instead.


His argument for spurious API breakage is strictly logically correct, but seems practically dubious to me. When have you ever had to unnecessarily break API compatibility because something you thought was an Option[T] result turned out to be really a T, always? Maybe I'm wrong and someone will post some convincing examples, but I currently don't see this as a problem that needs solving.

Union Types don't compose straightforwardly because they flatten; maybe this is desirable in some cases but the rationale in Hickey's talk leaves me unconvinced. The only practical use case for union types I'm aware of is smoother interfacing to dynamically typed APIs; any nice examples for something beyond that?


> When have you ever had to unnecessarily break API compatibility because something you thought was an Option[T] result turned out to be really a T, always?

That would be a breaking change. And should be, if you're into that sort of thing.

The objection is to the opposite case: What was a T is now an Option[T]. I don't know Scala specifically, but that's a breaking change in every typechecked language I know. Rich is arguing that it shouldn't be. But it could be possible even in typed languages through union types. For example, you can do this in TypeScript by changing T to T | undefined, which is a superset of T.


Nope, it's not the opposite case, I was just too lazy to spell it out. Which way around it is depends on whether it's a return value or a parameter: covariant vs. contravariant. If it's a parameter, an API change from T to Option[T] shouldn't break (you require less), whereas with a return type it's from Option[T] to T (you promise more).


To be fair, the "result" part in "something you thought was an Option[T] result turned out to be really a T" made it sound like you were speaking of the return type to me as well. I appreciate the elaboration though!


Yes, that was my fault, I overlooked "result," and I also appreciate the clarification.

I think the point remains that, while this is not a breaking change from a contractual viewpoint, most type systems would deem it incompatible.


I think you're kind of both agreeing. It would be nice if Maybe did work like he suggested, but in practice it's not that big a deal.


The sad thing is that Rich Hickey had some very good videos when Clojure was a new thing back in 2008–2009. Unfortunately, I've disagreed vehemently with most of his talks since then. In this case, it's completely illogical that a function `Maybe a -> b` should be callable as if it were a function `a -> b`. Do you want to know how I know? Because it would be just as illogical to allow a function `Vec a -> b` to be called as `a -> b`. And Rich must agree because Clojure itself does not support that!

I've learned that videos of his talks are just not worth my time.


Why is it illogical to say that a Maybe a -> b should be callable as if it were a -> b?

His point is that Maybe a should be composed of all the values of a, plus one more value, nil. A value of type a is a member of the set (nil + a). Why should having a more specific value reduce the things you can do with it? It breaks composition, fundamentally. It's like saying (+) works on integers, but not on 3. I'm saying this as someone who really enjoys type systems, including Haskell.


> His point is that Maybe a should be composed of all the values of a, plus one more value, nil

No, that's a simple union type. There are very good reasons for Maybe to be different than unions (Maybe can nest meaningfully, simple unions can't.)

Maybe (Maybe a) is a valid, and often useful, type.

Of course, if you have a function of type a -> b and find out you need a more general Maybe a -> b, instead of a breaking change, you just write a wrapper function that produces the correct result for Nothing and delegates to the existing function for Some(a) and you're done without breaking existing clients.
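
A minimal sketch of that wrapper approach, in Haskell with hypothetical names:

    -- the existing function, unchanged
    existing :: Int -> String
    existing n = show (n + 1)

    -- the generalized entry point: supplies a result for Nothing and
    -- delegates to the existing function otherwise, so old callers keep working
    generalized :: Maybe Int -> String
    generalized = maybe "no input" existing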

(Now, I suppose, if you had something like Scala implicits available, having an implicit a -> Maybe a conversion might sometimes be useful, though it does make code less clear.)


I agree that there are reasons for Maybe a to be a different type from (a | nil) but there are also good reasons to prefer (a | nil). Like most things, it's a set of tradeoffs. What I appreciated about this talk was that he went into the benefits of thinking about types in this way. It's (relatively) common to see the benefits of Maybe a explained, but more rare to see the benefits of full union types explained.


My problem is that I don't know when to expect nil from a call, because in Java null is part of every type and you can happily receive a null from anything; the compiler won't give you a warning. In OCaml I know what to expect, because Some x | None is super simple to reason about. I can never receive a null, a nil, or other things that somehow satisfy the type requirements. Clojure is great for untyped programming, everything is an Object after all, but I would still like to see a reasonable thing like Erlang's {:ok, x} | {:error, error} or OCaml's Some x | None. It is not an accident that many languages that value reliability implemented it like that.


Yes, the default of a lot of languages, (Java, C, etc) where nil is implicitly a member of every other type is a bad default. But that's a separate question.


> Why is it illogical to say that a Maybe a -> b should be callable as if it were a -> b?

Fundamentally because it would require you to conjure up a value of type b from nowhere when the Maybe a is Nothing. If we view the function type as implication this would not be valid logically without some way of introducing that value of type b.

You could imagine some function from Nothing -> b that could rescue us. But since it only handles one case of the Maybe type, it is partial (meaning it could give undefined as an answer). There are basically two total functions that we could change it to:

   * Maybe a -> b, in which case we are back where we started.

   * Unit -> b, which essentially is just b, which can be summed up as meaning we need some kind of default value to be available at all times.

So to be able to call Maybe a -> b as a -> b, you would need some default value available at all the call sites for a -> b.

Now this is only "illogical" because we don't postulate a value of type b to be used as this default.

> It's like saying (+) works on integers, but not on 3

No, it's like saying (+) must work on all integers AND a special value nil that is not like any other integers, but somehow included in them and all other data types. We can't do anything with this nil value since it doesn't carry any data, so in the case of (+) we would essentially have to treat it as an identity element.

This is good though, since (+) has 0 as an identity element, so we can just treat nil as a 0 when we encounter (+). However, when we want to define multiplication, we still need to treat nil as an identity element (since it still doesn't carry any data), except the identity element for multiplication is 1. This would be repeated for every new function that deals with integers.

So by mashing together Maybe and Integer we have managed to get a frankenstein data type with an extra element nil which sometimes means 0 and sometimes means 1.

Why not just decompose them into Maybe and Integer, and supply the default argument with a simple conversion function like fromMaybe?
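
Concretely, a tiny sketch (standard Haskell, names mine):

    import Data.Maybe (fromMaybe)

    -- nil-as-0 for addition and nil-as-1 for multiplication, made explicit
    plusWithDefault :: Maybe Integer -> Integer -> Integer
    plusWithDefault m n = fromMaybe 0 m + n

    timesWithDefault :: Maybe Integer -> Integer -> Integer
    timesWithDefault m n = fromMaybe 1 m * n

The caller picks the default at the use site, instead of one frankenstein nil trying to mean both.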

(FWIW, I actually agree with Hickey that using Maybe in api design is problematic and I've encountered what he's talking about. But while that might be an argument for where he wants to take Clojure, it's not an argument for dismissing type theory the way he does.)


You got it backwards. These problems arise when you want to use an (a -> b) function as (Maybe a -> b), not vice versa.


Yeah, you're right, I got confused when interpreting the parent comment. Thanks for pointing it out!

I guess I overlooked it because the other way is so logically trivial, since it basically boils down to (A || Nothing) -> B => A -> B, which is just an or-introduction precomposed. So if you want to implement Maybe generically, the "work" lies on the other side.

But since Hickey's argument sort of is that we shouldn't implement Maybe generically, I guess my argument here becomes circular. (Begging the question maybe?)


> I guess I overlooked it because the other way is so logically trivial, since it basically boils down

Yeah, that's (part of) Hickey's point. That the "best" type systems fail this test, and require manual programmer work to solve this problem. Again, I'm saying this as someone who really appreciates Haskell.


I think Rich does not like Some x | None because he does not much like simple pattern matching. This is why Clojure does not have first-class pattern matching syntax (you can emulate it, and there is a library, and it is just X amount of lines, etc., but still).

In this regard I really like OCaml:

    let get_something = function
      | Some x -> x
      | None   -> raise (Invalid_argument "Option.get")
This is very simple to understand and reason about, and very readable. I could not say the same about the examples Rich was making in the video. He kind of lost me with transducers, and with the fact that the Java interop with Java 8 features is rather poor.


I was surprised it was a separate library in Clojure and doesn't seem to be something that gets used much. It puts me off that it's missing one of the most attractive features of functional languages.


> Because it would be just as illogical to allow a function `Vec a -> b` to be called as `a -> b`. And Rich must agree because Clojure itself does not support that!

Maybe Clojure's standard library just isn't that focused on vectors? Python's stdlib doesn't support this either, but NumPy does.


I agree.

Further, his exposition on `Either a b` was built on a lack of understanding of bifunctors.

The icing on the cake was his description of his own planned type theory. What he described was, as far as I could decipher from his completely ignorant ravings, a row-polymorphic type system. However, he passes off his insights as novel rather than acknowledging (or leveraging) the decades of research that have gone into type theory to describe the very system he is trying to build.

Worse, he continued to implore his audience to spread FUD about type theory, claiming several times, "it is wrong!"


Well, in his defense, type theory is supposed to make software engineering simpler, not more difficult. So if even he doesn't understand it, then how can we expect a random programmer to?


Well, software engineering is the only engineering field that continuously produces half-baked, unreliable, crappy solutions. Other fields cannot afford such an attitude towards their products. I think a simple (inferred) type system is pretty useful for increasing correctness, and it helps you increase reliability (even though it does not prevent all of the mistakes you can make in software).


Are you sure that he doesn't understand it, or is it possible that you haven't worked with the same scale of systems he has?

Here's my response to his talk, which I found insightful:

https://lobste.rs/s/zdvg9y/maybe_not_rich_hickey#c_povjwe

Also, I think your comment suffers from the problem here, where you invoke "type theory" without any elaboration:

https://lobste.rs/s/zdvg9y/maybe_not_rich_hickey#c_ioeyob

Rich has taken the time to explain his thoughts very carefully, and he's clear about what his experience is and which domains he is talking about. Whereas I see a lot of vague objections without specifics that aren't backed up by experience.


You're right: there isn't much substance in my comment. I just meant to qualify the parent comment by saying that not everyone found this to be a "Best talk of 2018". A lot of good arguments for both sides were made in other places and it felt a bit obnoxious to reopen the argument here, where it's off topic. I should have simply stated that this is a controversial talk instead of adding my two cents.

Here's my understanding of one of your points: required fields in a data serialization format place an onerous burden on consumers. So in proto3, every field is optional, and this permits each consumer to define what's required for its own context.

Unfortunately, I can't find any connection between the dilemma at Google and the suitability of the Maybe type. You say this:

>The issue is that the shape/schema of data is an idea that can be reused across multiple contexts, while optional/required is context-specific. They are two separate things conflated by type systems when you use constructs like Maybe.

I agree - the value of a field might be a hard dependency for one consumer and irrelevant to a second consumer. But Maybe has nothing to do with this. If the next version of protobuf adds a Maybe type, it would not obligate consumers to require fields that they treat as optional. It would just be another way to encode the optionality, not optionality as a dependency but optionality of existence. A required input could still be encoded as a Maybe because the system can't guarantee its existence. So Maybe is simply an encoding for a value that isn't guaranteed to exist. And that's exactly the scenario you described in proto3 - now every field could be encoded as a Maybe.

A second point that stuck out to me:

>I didn’t understand “the map is not the territory” until I had been programming for awhile. Type systems are a map of runtime behavior. They are useful up to that point. Runtime behavior is the territory; it’s how you make something happen in the world, and it’s what you ultimately care about. A lot of the arguments I see below in this thread seemingly forget that.

Your worldview here is very different from my own, and perhaps while this difference exists, there won't be much mutual understanding. I don't find any relationship between types and anything I understand as "runtime behavior". Types are logical propositions. The relationship between programs and types is that programs are proofs of those propositions. Runtime does not enter into the picture. That's why constraint solvers work without running the program.


If that was your intention, simply saying "I didn't find this talk useful" would suffice. It's not necessary to say that "Rich Hickey doesn't understand type theory".

I would say that "X doesn't understand type theory" is becoming a common form of "middlebrow dismissal" [1], which is discouraged on HN.

> And that's exactly the scenario you described in proto3 - now every field could be encoded as a Maybe.

No, in protobufs, the presence of fields is checked at runtime, not compile time. So it's closer to a Clojure map (where every field is optional) than a Haskell Maybe.

This is true even though Google is using statically typed languages (C++ and Java). It would be true even if Google were using OCaml or Haskell, because you can't recompile and deploy every binary in your service at once (think dozens or even hundreds of different server binaries/batch jobs, some of which haven't been redeployed in 6-18 months.) This is an extreme case of what Hickey is talking about, but it demonstrates its truth.

> I don't find any relationship between types and anything I understand as "runtime behavior".

Look up the concept of "type erasure", which occurs in many languages, including Haskell as I understand it. Or to be more concrete, compare how C++ does downcasting with how Java does it. Or look at how Java implements generics / parameterized types.

[1] http://www.byrnehobart.com/blog/why-are-middlebrow-dismissal...


I haven't seen any of Rich's contributions to any major open source Haskell project. I can't really speak to his experiences with Haskell in proprietary code bases. Has he been a prolific Haskell contributor/hacker/user?

From his talk, he hasn't convinced me that he understands Haskell's type system. Not only does he misunderstand the contract given by `Maybe a`, but he conflates `Either a b` with logical disjunction, which is definitely a false premise. He builds the rest of his specious talk on these ideas and declares, "it [type theory] is wrong!"

He goes on to describe, in a haphazard and ignorant way, half of a type theory. As I understood it, these additions to "spec" are basically a row-polymorphic type system. Why does he refuse to acknowledge or leverage the decades of research on these type systems? Is he a language designer or a hack?

I can't even tell to be honest. He has some good ideas but I think this was one of his worst talks.


I think this comment is a bit too harsh, and I would rather we just discuss the points he makes, not his credibility. The man has decades of experience in software system development and architecture, and has built one of the most popular programming languages in the world, as well as Datomic. If that hasn't given him the right to give his opinion during the keynote of a conference for the language he built, then I don't know how much you want from him.

To apply your own standards, have you been a prolific Clojure contributor/hacker/user? Have you contributed to any major open source Clojure projects?

Can you actually say what he got wrong about Maybe/Either? Because it seems like he understands it perfectly well, speaking as a Scala developer.


> I think this comment is a bit too harsh, and I would rather we just discuss the points he makes, not his credibility.

I was trying to address the points he made, but the parent appealed to his authority, which I haven't found convincing.

> To apply your own standards, have you been a prolific Clojure contributor/hacker/user? Have you contributed to any major open source Clojure projects?

I have been a user on a commercial project. It's a fine enough language. I wouldn't call myself an expert. And I haven't given a keynote address where I call out Clojure for getting things I don't understand completely wrong.

> Can you actually say what about Maybe/Either he got wrong because it seems like he understands it perfectly well, speaking as a Scala developer.

His point about Maybe was misguided at best. The function `a -> Maybe a` is a different function than `a -> a`. Despite his intuition that the latter provides a stronger guarantee and shouldn't break code, callers may be expecting the Functor instance that the `Maybe` provides, and therefore it is a breaking change.
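
A minimal sketch of the kind of breakage I mean, with hypothetical names:

    import Data.Maybe (fromMaybe)

    -- the original signature
    lookupAge :: String -> Maybe Int
    lookupAge name = if null name then Nothing else Just 42

    -- a caller leaning on the Maybe API
    callerCode :: String -> Int
    callerCode name = fromMaybe 0 (lookupAge name)

    -- if lookupAge is "strengthened" to String -> Int, the expression
    -- `fromMaybe 0 (lookupAge name)` no longer typechecks: a breaking change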

His point about `Either a b` was perhaps further from the mark. It is not a data type that represents logical disjunction. That's what the logical connective, disjunction, is for. Haskell doesn't have union types to my knowledge. Either is not a connective; it's a Bifunctor. His point that it's not "associative" or "commutative" or what-have-you simply doesn't make sense. In fact he calls Either "malarky" or, charitably, a "misnomer."

To his credit he says he's not bashing on type systems, only Maybe and Either. But he later contradicts himself in his conclusions. He goes on about reverse and how "the categoric definition of that is almost information free." And then, "you almost always want your return types to be dependent on their argument..." So basically, dependent types? But again, "you wouldn't need some icky category language to talk about that."

So again, I think he has some surface knowledge of type systems but I don't think he understands type systems. I'm only a beginner at this stuff and his errors were difficult to digest. I think if he wanted to put down Maybe and Either he should've come with a loaded weapon.

He's had better talks to be sure! I just don't think this one was very good. And in my estimation was one of the poorer talks this year.


>His point about `Either a b` was perhaps further from the mark. It is not a data type that represent logical disjunction. That's what the logical connective, disjunction, is for. Haskell doesn't have union types to my knowledge. Either is not a connective. It's a BiFunctor. His point that it's not "associative" or "communtative" or what-have-you simply doesn't make sense. In fact he calls Either "malarky" or, charitably, a "misnomer."

I don't agree with Hickey, but isn't there a connection between Either as a basic sum type and logical disjunction via the Curry-Howard correspondence?

And wouldn't "forall a b. Either a b" be the bifunctor, since it has a product type/product category as its domain, while the result "Either X Y" (where X and Y are concrete types, not type variables) has the semantics of logical disjunction, i.e. it represents a type that is of type X or type Y?


Yes, there is a connection, which makes it all the more strange: Either does correspond as you say, which means it has the same properties as the connective. In the correspondence, the function arrow is implication, and a pair of functions ‘a -> b’ and ‘b -> a’ is logical equivalence. Using these we can trivially demonstrate that Either satisfies the associativity laws up to isomorphism.
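
For concreteness, the witnessing pair is a few lines of standard Haskell (my example, not Hickey's):

    assoc :: Either a (Either b c) -> Either (Either a b) c
    assoc (Left a)          = Left (Left a)
    assoc (Right (Left b))  = Left (Right b)
    assoc (Right (Right c)) = Right c

    unassoc :: Either (Either a b) c -> Either a (Either b c)
    unassoc (Left (Left a))  = Left a
    unassoc (Left (Right b)) = Right (Left b)
    unassoc (Right c)        = Right (Right c)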

That’s what makes his talk strange. He talks about types as sets and seems to expect Either to correspond to set union. If he understood type theory, then he’d understand that we use isomorphism, not equality.

You can express type equality in set theory and that is useful and makes sense.

But it doesn’t make sense in his argument.

Malarky? Come on. Doesn’t have associativity? Weird.


> The function `a -> Maybe a` is a different function than `a -> a`. Despite his intuition that the latter provides a stronger guarantee and shouldn't break code, callers may be expecting the Functor instance that the `Maybe` provides and therefore is a breaking change.

I don't really follow that. How can it be a breaking change? Can you give an example?


If you're building a parser, you may use: http://hackage.haskell.org/package/parsec-3.1.13.0/docs/Text...

Where your downstream parsers match on `Nothing` and assume the stream hasn't been consumed in order to try an alternative parser or provide a default.

If you change an equation to use `option` instead you have a completely different parser with different semantics.

I was thinking of a case where I use your function in a combinator that depends on the Functor and Monoid instances provided by the `Maybe` type. If you change your function to return only the `a` and it doesn't provide those instances then you've changed the contract and have broken my code. And I suspect it should be easy to prove the equations are not equivalent.


But protos and Haskell's type system solve different problems.

Perhaps, maybe, public APIs like protos shouldn't encode the requiredness of any piece of data (I actually fully agree with this).

But that says nothing about whether or not my private/internal method to act on a proto should care.

Another way of putting this is that requiredness shouldn't be defined across clients (as in a proto), but defining it within a context makes a lot of sense, and Maybe/Optional constructs can do that.

Or in other words, other people may use your schema in interesting and unexpected ways. Don't be overly controlling. Your code, on the other hand, is yours, and controlling it makes more sense. So the arguments that rely on proto stuff don't really convince me.

(I work at Google with protos that still have required members).


> But protos and Haskell's type system solve different problems.

That's pretty much my point. Hickey is very clear what domains he's talking about, which are similar to the domains that protos are used in -- long-lived, "open world" information systems with many pieces of code operating on the same data.

People saying "Rich Hickey doesn't understand type systems" are missing the point. He's making a specific argument backed up by experience, and they are making a vague one. I don't want to mischaracterize anyone, but often I see nothing more than "always use the strictest types possible because it catches bugs", which is naive.

I agree with your statement about private/internal methods. I would also say that is the "easy" problem. You can change those types whenever you want, so you don't really have to worry about modelling it incorrectly. What Hickey is talking about is situations where you're interfacing with systems you don't control, and you can't upgrade everything at once.

