Hacker News | Max_Limelihood's comments

Android is open source; macOS and Windows aren't. This gives me more control over my computer, especially since it means LineageOS and GrapheneOS for the desktop soon.


GNU/GNU


I've actually used GNU/GNU to refer to the Hurd :)


Ubuntu replaced their core userland utils with uutils, so the bulk of it. I’m guessing most other distros will follow suit.


Huh. I didn't know Ubuntu had replaced GNU coreutils. I'm not sure that alone counts as "the bulk of it", but it's definitely very significant.


Not jmp?


This is a really good explanation of why I find Julia (effectively a Lisp in terms of these features) to be indispensable. The ability to generate code on the fly makes life so much easier that I just can't live without it.


Yeah I often describe Julia as a Lisp in sheep's clothing.

Or as the M-expression Lisp we were promised :) I chuckled when I read:

> The way that common Lisp systems produce executable binaries to be used as application deliverables is by literally dumping the contents of memory into a file with a little header to start things back up again.

Which is pretty much how Julia's sys-/pkgimages work. Pkgimages are an incremental variation on this idea.

One of the novelties in Julia is the world-age system and the limits it places on the dynamism of eval.


Pity that Apple didn't push Dylan.


I agree that Julia satisfies the first two properties; however, it's not clear how it satisfies the third one (homoiconicity). In particular, why doesn't the argument with regard to Python apply to Julia as well?


I think this answer https://stackoverflow.com/a/31734725/5141328 by one of Julia's creators fits here. The TL;DR is that homoiconicity is a mix of two separate things: how willing a language is to represent itself, and how close the syntax for code is to the syntax for the data structure representing that code. Julia meets the first of these, but not the second. Whether this matters depends on why you care about homoiconicity. The biggest difference between Julia and Python here is that Julia has syntactic macros and Python doesn't (although see https://peps.python.org/pep-0638/).
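The Python side of the distinction is easy to see with the stdlib `ast` module (a minimal sketch): Python will happily represent its own code as data, but the representation shares no surface syntax with the code itself.

```python
import ast

# Python satisfies the first property (it can represent its own code as
# data) but not the second: the data structure looks nothing like the
# source it came from.
tree = ast.parse("x + 1", mode="eval")
print(ast.dump(tree.body))
# Prints a node tree along the lines of
#   BinOp(left=Name(id='x', ...), op=Add(), right=Constant(value=1))
# whereas in a Lisp the representation of (+ x 1) *is* the list (+ x 1).
```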


I just don't understand people who call von Neumann-style programming languages "Lisp-like" or "almost a Lisp". I've heard people say this of Python and Haskell as well, and I just don't see it, at all.


Never have I seen a title so perfectly encapsulate the very problem it's trying to solve.

If you keep trying to "protect" your research from any kind of competition, you're doomed from the start.


Answer: they don’t

(Seriously, I’ve gotten so fed up with Python package management that I just use CondaPkg.jl, which uses Julia’s package manager to take care of Python packages. It is just so much cleaner and easier to use than anything in Python.)


I hate python package management - I really do. But I've never actually had a problem with virtual environments, and I think it's because I just use virtualenv directly (rather than conda or whatever else).

I have these aliases in my .bashrc, and I can't remember the last time I had a major issue.

alias venv='rm -rf ./venv && virtualenv venv && source ./venv/bin/activate'

alias vact='source ./venv/bin/activate'

alias pinstall='source ./venv/bin/activate && pip install . && pip install -r ./requirements.txt && pip install -r ./test_requirements.txt'

I don't have all the fancy features, like automatically activating the virtualenv when I cd into the directory, but I've always found those to be a bigger headache than they are worth. And if I ever run into some incompatibility or duplicate library or something, I blow away the old venv and start fresh. It's a good excuse to get up and make a cup of tea.


> source ./venv/bin/activate

To this day I'm not quite sure why the venv developers decided that sourcing was a good idea; all it does can be effectively replaced with

    #!/bin/sh
    export VIRTUAL_ENV="path to venv"
    export PATH="$VIRTUAL_ENV/bin:$PATH"
    unset PYTHONHOME
    exec "$SHELL"
Just run this script to get into an "activated" shell. To deactivate, just press Ctrl+D. If you're really fancy, you can replace the last line with

    exec "${@:-$SHELL}"
to run a command directly in the activated environment (and then deactivate it immediately).
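The same three environment changes can be made from any language, not just a shell script. A hedged Python sketch of the idea (the `run_in_venv` helper and its arguments are hypothetical; on Windows the scripts directory is `Scripts` rather than `bin`):

```python
import os
import subprocess

def run_in_venv(venv_path, argv):
    """Run argv with the environment the sourced activate script would set:
    VIRTUAL_ENV exported, the venv's bin/ prepended to PATH, and
    PYTHONHOME unset."""
    env = os.environ.copy()
    env["VIRTUAL_ENV"] = venv_path
    env["PATH"] = os.path.join(venv_path, "bin") + os.pathsep + env.get("PATH", "")
    env.pop("PYTHONHOME", None)  # mirror the `unset PYTHONHOME` line
    return subprocess.run(argv, env=env).returncode
```

No sourcing, no `deactivate` function to define: the changes die with the child process, which is exactly the point the comment above is making.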


This technique is simple, doesn't reinvent the wheel, and even has its own name: Bernstein chaining.

http://www.catb.org/~esr/writings/taoup/html/ch06s06.html


> virtualenv venv

That would be Python 2; in 3 it's "python -m venv venv" (the first venv is the module to run, the second is the directory to put it in)

Otherwise yeah, it's the same and I also use it manually. Never had any problems.
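For what it's worth, the same stdlib `venv` module that `-m` invokes can also be driven programmatically — a small sketch using a throwaway temp directory:

```python
import tempfile
import venv
from pathlib import Path

# venv.create is what "python -m venv <dir>" calls under the hood.
target = Path(tempfile.mkdtemp()) / "venv"
venv.create(target, with_pip=False)  # with_pip=True would also bootstrap pip

# Every venv gets a pyvenv.cfg marker file at its root.
print((target / "pyvenv.cfg").exists())  # → True
```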


`virtualenv` still exists and is still actively developed. It's true that Python 3 ships with `venv` but I think `virtualenv` offers some additional features.

https://github.com/pypa/virtualenv


I use virtualenvwrapper[1] and can't remember any problems with virtual environments either. It sets up human readable aliases for you like "mkvirtualenv" to create a virtualenv and "workon" to activate a virtualenv.

[1] https://github.com/python-virtualenvwrapper/virtualenvwrappe...


A few weeks ago I spent about a week debugging my Poetry environment. Turns out, their latest release (which I believe was a patch bump!) brought in some breaking changes. And on top of that, a bunch of stuff was forcing python3.11 under the hood, whereas I was on python3.10.

It was a nightmare.


Poetry seems to break compatibility with every release of either itself or Python, double the fun.


This is similar to what I do, except my "new environment" alias executes a function that takes a python version, installed/specified via pyenv.

Never had a single problem, venv + pyenv is a great combo. As far as I can tell, like so many sources of frustration in tech, the issue typically lies with user error/not fully understanding the tool you're using. That isn't to say there isn't room for improvement -- most notably, package management in Python flies in the face of "there should be one -- and preferably only one -- obvious way to do it" -- but the tools we have work quite well.


I use pyenv[1] and the pyenv-virtualenv[2] plugin and I've not had a problem. It's so easy.

[1] https://github.com/pyenv/pyenv [2] https://github.com/pyenv/pyenv-virtualenv


pyenv needs to have its shims in place by running the pyenv init. You can run it when your shell starts but I find it kind of slow and for a while it used to be wonky in fish. But once I run the init, pyenv does work.

That's just for managing your python installation and virtualenv though. You still need to manage your packages and for that you have options like requirements.txt, pipenv (not pyenv lol), Poetry, and others.


I might steal these aliases, thank you.

Using virtualenv directly has also been my approach, and has not failed me yet.

I also used Poetry for one of my personal projects, and I liked what I saw.


I have struggled with conda and the huge space it usually eats up

I should learn to use venv properly

Thanks


Agreed. I tried the new package manager combined with venv and using venv directly seems best. A lot faster for a start.


It sounds mean to say it, but it's 100% true. I moved away from using Python wherever I can. I've had colleagues struggle for days to install well-used packages like pandas and numpy in conda.


I just began writing Python a few months ago. For years prior, I'd been a JS dev, and while NPM can be frustrating at times, I never encountered so many issues as I have in Python. It's crazy.

I'm now curious whether there are languages out there that do have a really nice packaging system.


FWIW, I find Cargo to be one of the biggest reasons I like Rust so much — maybe even more than anything to do with Rust itself or safe code.

I’ll often look for command line tools written in Rust, but not because of Rust fanboyism, but because I know I can just git clone the project and immediately start hacking on a new feature I need or a quick bug fix. In almost every other language I have to jump through one million hoops before I can build and run whatever it is, let alone have a nice developer experience (autocomplete, go to definition, etc).


Yeah, one of Julia's best decisions was taking heavy inspiration from Rust for the package manager. Rust was 100% the first language to get dependency management right.


> Rust was 100% the first language to get dependency management right.

In my experience, Java, Go, PHP, NodeJS have all got similar package management that works.


See my above comment, there's a reason why all of your examples work:

Java package managers tend to install packages written in java

Go installs packages written in go, and maybe C using cgo

Cargo installs packages written in rust

php package managers install packages written in PHP, extra extensions are rare

etc

People having trouble with Python are NOT having trouble with Python. They are having trouble because they are trying to use packages that are just Python bindings to massively complex C++ and Fortran libraries.

Often people using Python don't even have a C compiler installed (let alone a Fortran one for the scientific stuff), so pip blows up the first time they try to install a package that hasn't been pre-built for their system + Python version.
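"Pre-built for their system + Python version" is a narrow target. Roughly — and this is a simplification, not pip's actual resolution logic, which lives in the `packaging` library and generates a whole list of compatible tags — a binary wheel has to match a tag built from the interpreter, its version, and the platform:

```python
import sys
import sysconfig

# A rough approximation of the most specific wheel tag a CPython install
# looks for, e.g. "cp312-cp312-linux_x86_64". If no published wheel
# matches, pip falls back to building from source -- which needs those
# compilers.
pyver = f"cp{sys.version_info.major}{sys.version_info.minor}"  # cp = CPython
platform = sysconfig.get_platform().replace("-", "_").replace(".", "_")
print(f"{pyver}-{pyver}-{platform}")
```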


Yeah, npm was the first good package manager. It gets a lot of hate but my experience is that its strategy is the optimal solution for the problem it solves. And, I think a lot of things people complain about (lots of trivial packages, huge dependency trees, etc.) are an effect of solving the packaging problem well: if you make it easy to add dependencies, people will take advantage of that to add lots of dependencies.


Personally, zero complaints about Cargo (Rust) and very minimal complaints about NuGet (C#/.NET). My issues around NuGet are probably self-created because I refuse to learn the CLI [0] for it and I've had occasional issues with Visual Studio's UI for managing things.

[0] https://learn.microsoft.com/en-us/nuget/reference/nuget-exe-...


In a lot of ways, Paket is significantly better than NuGet, if you ever want to try something new :) It uses a lockfile approach like Cargo, has better dependency resolution, etc https://fsprojects.github.io/Paket/index.html


The JVM (maven) has quietly had everything working really well for decades. You rarely hear much about it because it just works, and what you hear is mostly people hating on it because it wouldn't let them shoot themselves in the foot. Cargo works much the same way AIUI.


Same here, I've been using yarn for years, and when I started using venv, I didn't understand why it had to be so complex. Even after reading this article, I still don't see why it is so complex! Yarn/npm has the right idea: dependencies go in the working folder and expect that hierarchy/protocol. Problem solved. The only problem I have with yarn/npm is the problem any package manager has and that is the attrition of dependencies and how to rank their security risk.


I can't think of any package/dep systems I actually like other than npm. And they're even starting to screw that up with the `import` weirdness instead of the `require` that's been so simple and easy.

Rust's system is probably the next best.

ObjC/Swift packaging is a flaming disaster in practice, unless it's improved since I jumped that ship. Last time, I remember every single project having to rely on Cocoapods.


Weirdness? `require` was Node weirdness because JavaScript lacked imports/exports at the time. The ES6 syntax is remarkably better: it allows imports to be async, and it doesn't need to run the code to see what's assigned to `module.exports`, which allows static analysis and therefore tree-shaking. Node's CJS syntax only works in Node, requiring that you transpile and bundle it for browsers. The ES6 syntax works in both Node and the browser.

I see anyone sticking with CJS syntax the same way I see Python devs who continue writing 2.7 code by choice in new projects and not because they are maintaining older projects.


Import is weird because there are several different ways to do the same thing, listed at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe... and summarized under "there are four forms of import declarations." Four! And I always forget the different ways they work.

Sure tree-shaking and browser support are nice, but they didn't have to make the syntax this complicated to achieve that. Not an issue with other languages.


I'd argue three. Namespace import is really just giving a name to the idea that "you must provide an alias to be used as a namespace if trying to do a wildcard import to avoid naming conflicts" because executing JS in a browser would otherwise not be able to detect naming conflicts - you'd end up overriding values of the same name with the most recent import which is almost never desired behavior. Think about how you might resolve the issue of two different modules exporting a function with the same name otherwise. "Mandatory namespacing" fixes the problem.

The "weird one" of the remaining three is side effect imports which isn't all that weird when you realize you're not assigning it to anything. Functionally this is the same as calling a function rather than assigning a function to a value. eg `myFunction = function myfunction() { //stuff }` vs `myFunction()` and when you think about it like that it becomes significantly less weird but also something you rarely want to do - it's mostly used for polyfills. Good to know it exists but you can probably ignore that it does.

So now you're left with two: Default and Named. Use Default when you want the entire library - or almost the entire library. Use Named when you want specific pieces of the library. That's all there is to it really. If I want a specific function from a library - there's no reason to import the entire library. For a while you'd mix both Default and Named exports for React due to how transpiling worked - this React blog post explains it well: https://reactjs.org/blog/2020/09/22/introducing-the-new-jsx-... but you don't really have a reason to mix the two in modern codebases.

Named imports tend to be preferred because Default imports means giving a name to it which can result in inconsistencies across a codebase when many people are working on it. (eg: `import SumTwo from 'sumtwoNumbers.js'` vs `import AddTwo from 'sumTwoNumbers.js'`. A named import `import {SumTwoNumbers} from 'sumtwoNumbers.js'` solves this problem)

There's still one final little "gotcha": there can only be a single Default export. Generally it's an object that contains "everything", but it doesn't need to be, and those cases are the only edge cases you'll run into. I can't say I've ever encountered one, so it's a "theoretical" reason to avoid Default imports rather than something that's been an issue in practice.

I guess I avoided a lot of this weirdness by basically only ever using the ES6 syntax and preferring Named imports (and not being stuck in the React ecosystem). CommonJS got to avoid some of the "weirdness" because it could pretend the browser doesn't exist (and leave handling it to bundling tools). So I guess I'll capitulate and say it's a little weird, but you can basically ignore it and use Named imports as the "One True Style".


Your explanation makes sense, but imo the import syntax shouldn't even require (no pun intended) an explanation.

The bigger thing is, I'm subject to however the deps I use want to export things, so they use a mix of those. Maybe in some cases you have to use `require` even if you don't want to, I forget.


I remember trying to use cocoapods back in 2015/2016, right around the time that Swift was technically available but not ready for production. I literally gave up trying to import packages, it was a shitshow.


I first used Swift at the same time. Cocoapods actually worked, but only after fighting it all day. Swift was recommended over ObjC, but it was broken. The compiler itself would segfault if I used the wrong map syntax. If the compiler worked, it took about 20X as long as an ObjC build. Core Data managed to produce non-optional objects that were nil in some cases.

Swift got fixed over time (which is why every basic SO question has 20 different answers for each Swift version), but it still sucks, and so does UIKit, and Xcode. That whole toolchain has been relegated to being just a dependency behind React Native for me. I mean look at the shitfest involved just to get a substring https://stackoverflow.com/questions/39677330/how-does-string...


I try not to hate on projects publicly, because I know a lot of devs smarter than me pour their sweat and tears into these things. But imagine releasing a new language in 2014 and fucking up strings.


Yeah my patience for Apple's native dev environment is down to nothing nowadays. The docs used to explain why not having a string length method is the right move, but an overwhelming amount of "wtf" from users got them to change it finally. At least I'll bash my own work just as much if I think I made a mistake.


Cargo (Rust) is pretty solid. Most of my minor complaints (like being unable to add packages from the CLI) have been resolved with time, as well.


IMHO, Go’s packaging system is very pleasant to use.


Elixir's packaging system is quite good. We went from empty project to working stable diffusion in 2h. 1.75 of those hours was installing CUDA.


Things like pandas and numpy are not python packages. Yes, they are packages FOR python, but they are not python.

https://hpc.guix.info/blog/2021/09/whats-in-a-package/ does a good job of explaining why installing packages like that is a complete shitshow.


Your colleagues should consider skipping conda and stick to using venv. It will make life much easier. Given pandas/numpy is huge in data science, moving away is not much of an option unless you are working on a personal project or already have a dedicated team comfortable with using a different stack. There is also the Docker option which is great but much more involved.


My advice would be to not use Conda or other such "extra tooling". More trouble than benefit. Stick with venv and poetry.


poetry is "extra tooling"


It's really nice though.


Until it hangs at the dependency resolution step, which happened to me recently on a FastAPI/SQLAlchemy project. I had to add deps one by one so as not to overwhelm it (rolls eyes).

It also doesn't play nice with publishing to custom PyPI destinations (e.g. self-hosted GitLab) in my experience. I could have tracked down the issue, but the surrounding code was clearly a mess, so I gave up on that one.


also there's like 3 different flavors of virtual env now and me being 8 years out of date with my python skillz i have no idea what the current SOTA is with python venv tooling :/

i dont need them demystified, i need someone smarter than me to just tell me what to do lol


> and me being 8 years out of date with my python skillz i have no idea what the current SOTA is with python venv tooling :/

It doesn't really matter, by the time you sit down and use it you'll find whatever that is, has also been deprecated and replaced by 2 more.


The problem with software development in 2023 in a nutshell, well played sir.


The reality is that if you ask 3 different people you're going to get 3 different answers. They're fundamentally the same, just a matter of package management. As far as I'm aware, the current "SOTA" is Poetry. I liked Pipenv for quite some time, but Poetry is just so much faster IME.


It also makes it very hard for new devs trying to learn Python. Coming from Ruby and JavaScript, you just use Bundler or npm, but Python is so strange; even the way it runs files is different, with the module thing.


> i dont need them demystified, i need someone smarter than me to just tell me what to do lol

Dockerfile ;)


You’re 100% correct, I think. But it’s notable that in that case, an extra functional programming language would make things worse by dividing effort, not better.


Because 14 wasn’t enough. https://xkcd.com/927/


Actual AI researchers have wildly different opinions on this; IME, going off all the AI researchers I’ve talked to, they tend to split about 50/50.

