Hacker News | new | past | comments | ask | show | jobs | submit | jedberg's comments

I think it's code for "the government will have to bail them out".

Seems like Sam was angling for that with some of his China vs USA rhetoric

Why would they need a bail out? Their assets can be sold off, they can be taken over or be absorbed by another American entity.

Absolutely no reason for a bail out.

It may hurt the ego of Altman and Brockman - but that's their problem.


If DoD systems are running on OpenAI infrastructure, you can't just pause them for 6 months during an acquisition. This gets far more complex than just "liquidation of assets".

Because their assets would have been vastly overvalued. The bailout is when the government buys those assets at as close to that fictional valuation as they can, and likely then sells them back at their actual worth.

> Absolutely no reason for a bail out.

There's never been any reason for a bailout. It's just handing tax money to wealthy people who have made bad decisions.


The contract with the Pentagon is a good first step. Being a government contractor is pretty fail safe.

At some point you reach a size where too many politicians, and the people who own them, have invested so much money that they're willing to take a political hit of any size in order to save themselves from personal losses when you fail.

We need a law that says if you hold any data about a person, they must be notified when anyone accesses it, including law enforcement.

I used to work in criminal investigations. I understand how this might make investigation of real crime more difficult. But so does the fact that you need a warrant to enter someone's home, and yet we manage to investigate crime anyway.

Your data should be an extension of your home, even if it's held by another company. It should require a warrant and notification. You could even make the notification be 24 hours after the fact. But it should be required.


The entities holding the information here are literally police departments. The information itself is evidence, used in active criminal investigations. It's good to want things, though.

The information is not in any way restricted to use in active criminal investigations, and further, has been found to frequently be used for a variety of other purposes.

It's a bit like saying pornography is used in the study of human anatomy.


I don't know what you're talking about. I'm talking about the legislation 'jedberg proposed.

~jedberg is talking about a hypothetical law that would apply to ALPR data. In reply, you said "The information itself is evidence, used in active criminal investigations." ("The information" here referring to ALPR data.) (You also said, "The entities holding the information here are literally police departments.", but I don't see that that's relevant unless we choose to believe that police departments are more deserving of public trust by default than any other organization.)

I was replying to the "used in active criminal investigations" part. Yes, the ALPR data managed by Flock is sometimes used in active criminal investigations. However, it's also used for many other things.

The many other things that it's used for supports ~jedberg's argument.


I know, that's why I said "including law enforcement" :)

So we're clear, you believe there should be a law that, when a police department collects information about you during a criminal investigation, they should notify you directly that they've done so?

It does make sense. Police are absolutely not beyond reproach, and there are screwups all the time. They need to be held to a high standard.

It's also easy to imagine reasonable compromises, like a time delay where they only have to report after e.g. 48 hours, and allowing a system whereby a judge can issue a warrant to extend that delay.


That's more or less what a search warrant is, so yes.

Yes.

> Your data should be an extension of your home, even if it's held by another company.

Nice idea, but at least in the U.S. (with the lone exception of LE obtaining cell phone location records), courts have consistently held that if you give your data to someone else, you are no longer entitled to an expectation of privacy in it. https://en.wikipedia.org/wiki/Third-party_doctrine

If you want your data to be considered an extension of your home, at least for now, keep it at home.


Nice idea (2), but many companies and govt agencies force one to give lots of data or you will not be receiving services, sometimes very important services.

I think the notion that data would be a home is beyond weak, but the explanation you gave for why isn't solid either, since the subjects of the data do not need to consent, and in this case haven't consented.

That is, recordings of people in public settings are (in some jurisdictions) the property of the recorder, but a recording still isn't a home. Just imagine how that would work in some jurisdictions: someone takes a picture of you and it's trespassing? Would you be able to shoot them?


Is there not some concept that utilizes cryptography in a way such that information about people is accessible, but if it's accessed, then the access request is added to a ledger (akin to blockchain) such that who made the access, when, and about whom becomes provably public knowledge?
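One way to sketch the idea (this is a hypothetical illustration, not a description of any existing system; all names are made up): an append-only log where each access record includes the hash of the previous record, so any tampering with who accessed what, when, and about whom is detectable after the fact.

```python
import hashlib
import json
import time

def _hash(record: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class AccessLedger:
    """Append-only, hash-chained log of data-access events (illustrative only)."""

    def __init__(self):
        self.entries = []

    def record_access(self, accessor: str, subject: str, reason: str) -> dict:
        entry = {
            "accessor": accessor,
            "subject": subject,
            "reason": reason,
            "timestamp": time.time(),
            # Each entry links to the previous one, blockchain-style.
            "prev_hash": self.entries[-1]["hash"] if self.entries else "0" * 64,
        }
        entry["hash"] = _hash({k: v for k, v in entry.items() if k != "hash"})
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash and check the chain links; False means tampering."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            if e["hash"] != _hash({k: v for k, v in e.items() if k != "hash"}):
                return False
            prev = e["hash"]
        return True
```

This only gives tamper evidence, not the "provably public" part; for that, the hashes would need to be published somewhere the log keeper doesn't control.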

We'll sooner get a law that will forbid notifying a person when such data is passed to law enforcement.

Trying to understand the position here.

This would be excluding gag orders correct?

And regular orders currently notify the service provider, but they don't necessarily notify the target, they just don't have a prohibition on the service provider notifying the target.

Finally, recordings of public areas actually aren't impacted by warrants at all, right? But what you are saying is not just that LEA would need warrants to look at public recordings from a willingly cooperating camera owner, and that the warrants can't be gag orders (unless specified), but that the targets must be notified. So even if the subject of the search were someone else, the fact that I'm included in a recording would compel the LEA to notify me?

And how exactly would I be notified? Wouldn't that necessitate even more privacy invading features like facial recognition and a facial to contact information technology? Not an uncommon paradox.

Again, I just want to understand the position. My own position might leak through and make the questions seem leading, but I can't help it.


Alternatively, one could create serious civil damages for those capturing surveillance imagery that causes various harms including false prosecution for any data they collected, even if it was unlawfully taken or used after it was collected. ... then let the liability work out the problem by making it too risky to run non-targeted mass surveillance apparatus.

This would avoid having to define what is and isn't a mass surveillance system. Any camera recording off your property would carry a legal risk for the operator, but if you're just recording locally and only using it to discourage or solve crimes you're suffering, the risk would be minimal and justified.


We also (I worked there at the time) had software that basically said, "Joe watches all of his disks every weekend and drops them in the mail on Tuesdays, let's just assume he's going to do that and ship his new disks Monday morning". And other such predictions.

If you had a very regular viewing behavior you could have your new disks the same day as you shipped your old ones. To the customer, it was magical.
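The heuristic described above could be sketched roughly like this (a hypothetical reconstruction, not the actual system; the function name and thresholds are made up):

```python
from collections import Counter

def predicted_ship_day(return_weekdays, min_history=4, regularity=0.75):
    """If a customer's past returns cluster on one weekday, bet on it.

    return_weekdays: list of ints, 0=Monday .. 6=Sunday, one per past return.
    Returns the weekday to pre-ship new discs on (the day before the usual
    drop-in-the-mail day), or None if the behavior isn't regular enough.
    """
    if len(return_weekdays) < min_history:
        return None  # not enough history to trust a pattern
    day, count = Counter(return_weekdays).most_common(1)[0]
    if count / len(return_weekdays) < regularity:
        return None  # returns are too scattered to predict
    return (day - 1) % 7  # ship the day before the usual mailing day
```

So a customer who reliably mails discs back on Tuesdays gets their next discs shipped Monday, and from their point of view the turnaround looks instant.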


> And why does your comment say you're a 30-person company but the title says 60?

AI hallucination? :)


Reading through it, I didn't see any mention of write access. It looks like the agent is strictly read-only with access controls.

The thing that AI is best at is summarizing vast quantities of information. That means the most natural thing for an AI to do is be "the one tool to rule them all".

The more information it has access to, the more useful the answer can be. But that also means that it can answer all the questions.


>> The thing that AI is best at is summarizing vast quantities of information

By definition a summary is the best at nothing, though, and the mentality that the best way to rule is from a single summarized interpretation is both flawed and scary. It's not answering all questions; it's attempting to provide a single summation dramatically influenced by training. Go ahead and incorporate this into your balanced, multi-perspective decision-making process, but "one tool to rule them all" is not the same thing and definitely not what we're getting.


"If all you have is an LLM, every problem looks like summarizing information."

Emphasis on looks like ;-)


> the mentality that the best way to rule is from a single summarized interpretation is both flawed and scary.

Very much agree. This reminded me of Project Cybersyn [1], an attempt by socialist Chile to build a central heavily-computerized room that would summarize their entire economy to a few men literally pushing the buttons. Complete with 70s aesthetics and Star Trek TOS feel.

[1] https://thereader.mitpress.mit.edu/project-cybersyn-chiles-r...


Not until it's context window and attention is infinite.

It's best at summarizing/processing a modest amount of information quickly. But given more, its usefulness drastically decreases. This demands tooling that divides up the information and the flow.


“Not until it is context window”???

this has exceedingly obvious limits. The primary limit is the context pollution that happens when you give it too much context.

The claim by Elon and the rest of the AI crew that LLMs can just grow forever is not realistic or borne out by real-world testing.

It can do "everything", but by "everything" I mean it'll still be fine-tuned and harnessed and agentified, which isn't really the same as the model itself being able to do everything.


I make holiday light shows with an open source program called XLights[0]. I'm sure you've seen the videos[1] of what people[2] can do. Usually the top comment is "man that is cool but I wouldn't want to be their neighbor!" followed by "my neighbors love my light shows".

Creating the sequences is time consuming, and a lot of people end up buying them or sharing them, but those are rarely as good as the ones you make for yourself.

Some folks have dabbled with using AI to create the sequences. I think the biggest issues are a lack of training data and the fact that it's a very visual art, so there needs to be a better feedback loop between the text representation and the visual manifestation.

So if you're into using AI to make physical world things better, that would be a good place to look!

[0] https://xlights.org

[1] https://youtu.be/enhhtPZMwCE?t=119

[2] https://www.youtube.com/watch?v=z5dfpe_-Lgg


Warning: Do NOT click on [1] unless you have 25 minutes to spend transfixed on a video. Holy cow...

It takes about 10 hours per minute of song to make a sequence like that. Imagine if AI could help speed that up!

I wonder if you can break down the sequences into segments (parts) and then the AI doesn't have to know how to control LEDs directly, but can instead put sequences together in accordance with the music.

Maybe even process MIDI files somehow ...


When we do it as humans, that's basically how we do it. We may have an overall idea for a theme across the song, but usually you're zoomed into a few seconds of music and adding light effects to it.

I assume you've seen this https://news.ycombinator.com/item?id=47675446 which is related

I had not, thanks! Interestingly, using FFT for this has been around for a long time, but combining it with transformers could have interesting new results.
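The FFT side of this is the well-trodden part: a common approach is spectral flux, where you compare each frame's spectrum to the previous frame's and treat sharp increases in energy as beat/onset candidates, which could then anchor light-effect segments. A minimal sketch (a naive DFT is used here to stay self-contained; frame sizes and thresholds are illustrative, not tuned values):

```python
import math

def dft_magnitudes(frame):
    """Naive DFT magnitude spectrum (fine for illustration; use an FFT in practice)."""
    n = len(frame)
    mags = []
    for k in range(n // 2):
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

def spectral_flux(samples, frame_size=64):
    """Per-frame increase in spectral energy; peaks suggest onsets/beats."""
    flux = []
    prev = [0.0] * (frame_size // 2)
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        mags = dft_magnitudes(samples[start:start + frame_size])
        # Only count energy that appeared since the last frame.
        flux.append(sum(max(0.0, m - p) for m, p in zip(mags, prev)))
        prev = mags
    return flux

def onset_frames(flux, threshold_ratio=2.0):
    """Indices of frames whose flux exceeds threshold_ratio x the mean flux."""
    if not flux:
        return []
    mean = sum(flux) / len(flux)
    return [i for i, f in enumerate(flux) if f > threshold_ratio * mean]
```

The onset frames give you segment boundaries; the harder, unsolved part is what a transformer would then do within each segment, since judging the result is visual.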

It's interesting to me that you can have something like this that is "hard to build" but "easy to verify" - humans are really good at telling if something is "off" about the visualization.

And the guy next to him is just staring at his phone, probably thinking, "I'm not even gonna ask".

Although if it were me I'd probably annoy the heck out of him asking why he had a Wii on the airplane!


Looks like a Switch 2 actually.

You're totally right, it's a Switch 2.

I don’t get it, how is he getting OS X running “on a Wii” on a Switch 2?

It's what the guy next to him has.

> their tech is used to bomb children

If you're talking about the school in Iran, that wasn't OpenAI. That was a Palantir system that pre-dates OAI by a few years, and was due to a bad entry in a spreadsheet, that showed the building as military housing. Which it was a few years ago.

180 people lost their lives because of bad data in a spreadsheet, but not because of AI.


180 children lost their lives because of decisions by people in the US military (and ultimately the US government / the POTUS).

Let's not fall into the trap of adopting narratives created to waive accountability. The spreadsheet didn't launch a missile, the spreadsheet didn't authorize the strike and the spreadsheet didn't select the target.

Not to mention that "outdated spreadsheet" is also a hilariously anachronistic excuse for a war crime if you consider what kind of satellite technology the US has publicly acknowledged having access to, let alone what it likely has access to beyond that.

The difference between intentional premeditated murder and reckless endangerment resulting in a killing is not guilt and innocence but merely the severity and nature of a crime. Both demonstrate a callous disregard for the sanctity of human life, one just specifically seeks to extinguish it, the other merely accepts death and suffering as an acceptable outcome.


Please talk to your criminal defense lawyer.

This is nonsense.


Palantir was using anthropic and its use is being replaced by openai.

Yes, but not for the system that decided to bomb a school. That was a Palantir in-house system.

Afaik the palantir system utilized ai.

Yes, their own in house AI. That system doesn't use Anthropic or OpenAI.

Many years ago, not "a few years ago". Also, you could argue that 180 people lost their lives because of an evil war, of which the USA and Israel are the aggressors. And we definitely don't talk enough about that part.

The Dodgers could have so easily turned this into a huge win. After 50 years they could have just awarded him a paper lifetime pass. Scan this and get in for any game! It would have been so easy.

Or if they really wanted him to go digital, just buy him a smart phone and install the app for him!


No smartphone. A cheap wifi-only Android tablet without a lock screen and their stupid app on the home screen.
