
The last time the Duke of York was arrested was in 1483. And before that, the most recent prior was in 1452 during the War of the Roses.

Blimey, he's older than he looks.

I wonder, does the streak technically still continue, given that he was stripped of his titles?

Yeah, it requires an act of parliament to scrap the Duke title; neither the King nor voluntary resignation can do that.

I stand corrected.

Defence firms like Raytheon are often happy to pay for stuff like this. What happens afterwards with the exploit is anybody's guess. Source - a vague memory of a Darknet diaries episode.

The headline that PyTorch has full compatibility on all AMD GPUs would increase their stock by > $50 billion overnight. They should do it even if it takes 500 engineers and 2 years.

Does anybody really understand why this hasn't been done? I know about ongoing efforts but is it really THAT difficult?

You know it’s probably a combination of things but mostly that AMD do not have a capable software team… probably not the individuals but the managers likely don’t have a clue.

Aaaaaaand torch is not a simple easy target. You don't just want support but high-performance optimized support on a pretty-complex moving target... maybe better/easier than CUDA but not that much it seems.
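For what it's worth, PyTorch's ROCm builds already masquerade as CUDA: the `torch.cuda` namespace is reused for AMD GPUs, so portable device-selection code looks the same on both vendors. A minimal sketch (guarded so it degrades to CPU semantics when `torch` isn't installed):

```python
# Device selection that works on CUDA and ROCm builds of PyTorch alike.
# ROCm builds reuse the torch.cuda namespace, so the same check covers
# AMD GPUs; torch.version.hip is set only on ROCm builds.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
    is_rocm = getattr(torch.version, "hip", None) is not None
except ImportError:  # torch not installed: fall back to CPU semantics
    device, is_rocm = "cpu", False

print(device, is_rocm)
```

The hard part the comment alludes to isn't this surface API, it's that every fused kernel behind it has to be reimplemented and tuned for AMD hardware.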

The political answer is that open justice provides ammunition for their political opponents, and that juries also tend to dislike prosecutions that feel targeted against political opponents. See Palestine Action as a left-wing example and Jamie Michael's racial hatred trial as a right-wing example.

Obviously the Ministry of Justice cannot make other parts of government more popular in a way that appeases political opponents, so the logical solution is to clamp down on open justice.


The names of minors should never be released in public (with a handful of exceptions).

But why shouldn't a 19 year old shoplifter have that on their public record? Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?


If you prohibit the punishment of minors, you create an incentive for criminals to exploit minors.

Why are we protecting criminals, just because they are minors? Protect victims, not criminals.

Unfortunately reputational damage is part of the punishment (I have a criminal record), but maybe it's moronic to create a class of people who can avoid meaningful punishment for crimes?


> If you prohibit the punishment of minors, you create an incentive for criminals to exploit minors.

This - nearly all drug deliveries in my town are done by 15 year olds on overpowered electric bikes. Same with most shoplifting. The real criminals just recruit the schoolchildren to do the work because they know schoolchildren rarely get punished.


We protect minors because they are children, and they are allowed to make mistakes.

At a certain point, we say someone is an adult and fully responsible for their actions, because “that’s who they are”.

It’s not entirely nuanced—and in the US, at least, we charge children as adults all the time—but it’s understandable.


But you create an incentive for organized crime to recruit youth to commit crimes and not have to suffer the consequences.

At a certain point, poorly thought out "protections" turn into a system that protects organized crime, because criminals aren't as stupid as lawmakers, and exploit the system.

There is a big difference between making a mistake as a kid that lands you in trouble, and working as an underling for organized crime, committing robberies, drug deals, and violent crime without having to face responsibility for your actions.

The legal system has so many loopholes for youth, for certain groups, that the law is no longer fair, and that is its own problem, contributing to the decline of public trust.


> working as an underling for organized crime to commit robberies, drug deals, and violent crime

Have you ever considered that these children are victims of organized crime? That they aren't capable of understanding the consequences of what they're doing and that they're being manipulated by adult criminals?

The problem here isn't the lack of long term consequences for kids.


I used to be a drug dealer so I know what is going on and they aren’t victims, they are willing recruits.

12 year olds know it’s not right to sell crack.

The problem is the gap between lack of legal opportunities for youth and the allure of easy money, status and power from being a criminal. Doesn’t help that the media makes it look so fun and cool to be a gangster.


12 year olds selling crack aren't making a rational decision based on the lack of sufficient long term consequences.

So what exactly would worse long term consequences do besides ruin the life of kids making bad decisions?


What's the alternative? A 14 year old steals a pack of gum, and he's listed as a shoplifter for the rest of his life?

Just because exceptions are exploitable, doesn't mean we should just scrap all the exceptions. Why not improve the wording and try to work around the exceptions?


If you don't think this crime is a big deal, then why do you think this crime would matter if it was in the public record tied to their name? These two ideas you have are not compatible.

I don't think stealing a pack of gum at 14 years old is a big deal, but many people have a huge problem understanding proportionality: To them, it's binary. You're either a criminal or not a criminal, and if this kid's record shows "shoplifter" until he dies, a significant number of people, including employers, will lump him into the "criminal" bucket for the rest of his life.

And what about the kids who get recruited for gang activity and do some pretty messed up stuff as kids? Should that not appear on a public record? This is where the problem lies: you essentially can only ever make it an all or nothing approach, as it gets a lot harder to determine what should or shouldn't be a part of a public record. Especially since, as you reflected in your comment, this becomes an opinion thing on whether someone thinks it matters or not what crime they did as a kid.

The problem that is happening in most Western countries is that criminal organizations take advantage of the fact that minors get reduced sentences and that their criminal records are usually kept sealed (unless tried as an adult). Whether it be having them steal cars, partake in organized shoplifting operations, muggings, gang activity, drug dealing, etc...

Your reasoning for why this information shouldn't be public record seems to boil down to the fact that you don't agree with other people's judgement of someone's past crimes. You'd like to see more forgiveness, and you don't think others will show the same forgiveness, so you want to take away all the access to information because of that. To me that seems like a view from a point of moral superiority.

I'd rather people get access to this information and be able to use their own brains to determine whether they want that person working there. If you were involved in shoplifting at 17 years old, and turn 18, I think it would be very fair for a store owner to be able to use that information to judge you when making a hiring decision. To me it doesn't make sense that you turn a magical age of 18 and suddenly your past poor decisions vanish into a void.


I think we can at least agree that children recruited by organized crime to steal cars, break into homes, assault people, and so on, should be treated differently than a kid who stole a pack of gum from a store. Whatever the solution is, it has to take into account the seriousness of the crime, and it has to discourage this binary criminal / not criminal thinking.

> Why are we protecting criminals, just because they are minors? Protect victims, not criminals.

Protect victims and criminals. Protect victims from the harm done to them by criminals, but also protect criminals from excessive, or, as one might say, cruel and unusual punishment. Just because someone has a criminal record doesn't mean that anything that is done to them is fair game. Society can, and should, decide on an appropriate extent of punishment, and not exceed that.


Would you want the first thing to show up after somebody googles your name to be an accusation of improper conduct around a child? In theory, people could dig deeper and find out you won in court and were acquitted, but people here should know that nobody ever reads the article...

If you were hiring a childminder for your kids, would you want to know that they had 6 accusations of improper conduct around children in 6 different court cases - even if those were all acquittals?

As a parent, I would want to know everything about anyone who's going to be around my children in any capacity. That doesn't mean I have a right to it, though.

>openly admits his beliefs results in parents not making good decisions on who to allow near their children, keeps going anyway

great moral system you have there


That's a bad faith take.

In one comment you managed to violate a whole bunch of the HN commenting guidelines.

https://news.ycombinator.com/newsguidelines.html


how else would you interpret admitting you don't think parents should have a right to know the backgrounds of the people with access to their children before making informed decisions on whether or not to allow it?

please, show me your good faith interpretation and i will take back my comment


Nobody gets to have unbounded information about others. It's weird that you think there should be no privacy constraints.

Why are you saying unbounded when the discussion is about court proceedings and convictions? There is a clear and consistent boundary here, no one is asking for search logs and round the clock surveillance.

what if these “others” voluntarily apply to a position where they have regular contact and help take care of your children? is it ok then to be informed on whether or not they are a convicted child rapist?

The UK has an official system [1] for checking whether people should be allowed to work with vulnerable people.

[1] https://en.wikipedia.org/wiki/Disclosure_and_Barring_Service


If it was reported in a newspaper then that would likely already be the case.

> Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?

Yes


It is the UK we're talking about after all...

Where the accused have rights too?

Where journalists have very few rights, and people posting their bad (wrong) ideas (think) even fewer.

Not according to the World Press Freedom Index, where it is ranked 20th.

Where speaking truth isn’t a right or a defense

Most media in Europe are required to ambiguate names of criminals. For instance by removing the first name or the last name.
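One common variant of this convention (the exact rules differ by country and outlet, so this sketch is purely illustrative) keeps the first name and reduces the surname to an initial:

```python
def ambiguate(full_name: str) -> str:
    """Reduce a surname to an initial, e.g. "Max Mustermann" -> "Max M."."""
    parts = full_name.split()
    if len(parts) < 2:  # single token: nothing to ambiguate
        return full_name
    return f"{parts[0]} {parts[-1][0]}."

print(ambiguate("Max Mustermann"))  # Max M.
```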

The movie industry is doing well from AI.

Thus far AI has only been used to create fan fiction clips that generate free marketing for legacy IP on TikTok. And the rights holders know that if AI gets good enough to make feature length movies then they'll be able to aggressively use various legal mechanisms to take the videos off major sites and pursue the creators. Long term it could potentially lower internal production costs by getting rid of actors & writers.

Music is very different. The production cost is already zero, and people generating their own Taylor Swift songs is a real competitive threat to Spotify etc.


Just right now: ByteDance to curb AI video app after Disney legal threat

https://www.bbc.com/news/articles/c93wq6xqgy1o


There were some occasions where he replied to questions with "not for email".

If it’s a commodity then it’s even more competitive so the ability for companies to impose safety rules is even weaker.

Imagine if Ford had a monopoly on cars, they could unilaterally set an 85mph speed limit on all vehicles to improve safety. Or even a 56mph limit for environmental-ethical reasons.

Ford can’t do this in real life because customers would revolt at the company sacrificing their individual happiness for collective good.

Similarly GPT 3.5 could set whatever ethical rules it wanted because users didn’t have other options.


The Nissan GT-R in Japan is geo-limited to only being allowed to race on race tracks.

You mean the standard 180kph speed limiter (which is on all cars in Japan) is removed on the GT-R when it's on a track based on GPS. There's nothing stopping you from racing it up to 180kph on the street.
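Mechanically, such an override could be as simple as a whitelist of track geofences: outside any of them, the standard cap applies. A toy sketch (the coordinates and data structure are invented for illustration, not Nissan's actual implementation):

```python
# Hypothetical geofence check: the 180 km/h cap applies everywhere
# except inside a recognised track's bounding box (boxes are made up).
TRACKS = [
    (34.83, 34.86, 136.52, 136.56),  # rough box around a circuit
]

def speed_cap_kph(lat: float, lon: float):
    """Return the enforced cap in km/h, or None when on a track."""
    for min_lat, max_lat, min_lon, max_lon in TRACKS:
        if min_lat <= lat <= max_lat and min_lon <= lon <= max_lon:
            return None  # limiter disabled on track
    return 180.0  # standard Japanese-market limiter

print(speed_cap_kph(35.68, 139.69))  # city street -> 180.0
print(speed_cap_kph(34.84, 136.54))  # inside track box -> None
```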

Startups fail because of a lack of adoption far more often than by any other reason, including competitive and monetisation factors.

If your developer company gets popular you’ll be rich enough anyway. You might need to choose between screwing over your VCs by not monetising or screwing over your customers by messing around with licences.

But you yourself as a founder will likely be okay as long as the tool is popular.


This is not necessarily true. The wrong monetization can be the killing blow. The market can change and your business model which used to work can suddenly fall apart. A recent example of a business model change is Tailwind, where traffic to their open-source docs plummeted and suddenly not enough people are upgrading to their commercial licenses.

Startups die for a variety of reasons, even if products are popular and loved.


Tailwind was (is?) also selling "lifetime" licenses, which means eventually their sales would collapse anyway, once they have sold a license to most interested customers. They were always going to need to pivot at some point, regardless of traffic to their docs.

To play the devil's advocate, more people are born every day and as long as there are more developers today than there were yesterday, lifetime licenses can bring in a trickle of money each month, especially if the marginal cost of each new customer is zero or near zero.
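As a back-of-envelope illustration (every number here is hypothetical), the point is that lifetime-license revenue then scales with market growth rather than with the installed base:

```python
price = 299                # hypothetical one-time lifetime price
new_devs_per_month = 1000  # hypothetical growth of the addressable market
conversion = 0.002         # hypothetical fraction of newcomers who buy

monthly_revenue = price * new_devs_per_month * conversion
print(monthly_revenue)  # 598.0 -> a trickle, not a recurring base
```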

True enough, though I think Tailwind suffered something of a black swan event of having lifetime pricing plus AI coding assistants hitting an inflection point that immediately and thoroughly decimated the value prop of their core monetized product.

Especially as most AI safety concerns are essentially political, and uncensored LLMs exist anyway for people who want to do crazy stuff like having a go at building their own nuclear submarine or rewriting their git history with emoji-only commit messages.

For corporate safety it makes sense that models resist saying silly things, but it's okay for that to be a superficial layer that power users can prompt their way around.

