
I have a feeling that The Document Foundation is going to end up being the loser here. Collabora is the entity that can fund development with a commercial offering. It sounds like they employ the core contributors to the project as well.

Regardless of who "wins," I'm just here to say that I like OnlyOffice a lot better and switched away from LibreOffice. I like that it just looks more like a modern program and overall feels less clunky.


Make sure to backup regularly. I don't know how good OnlyOffice is these days, but it definitely has (had?) a terrible history of quality control. We migrated off it a couple of years ago after losing several days of work due to severe (and, as it turned out, widely known) bugs in how it handled changes/document version tracking.

OnlyOffice is not really open source. They say they are but they also add impossible conditions to their license. (you are forced to use their logo, but you are also not allowed to use their logo.)

That doesn’t bother me. I’m just looking for a free office program that runs offline and works well.

It looks like the Euro Office suite will improve upon it when it launches and remove the remaining downsides to it.


I’m not sure why that’s a question, it’s just a downloaded file. You can even watch it download separately when you enable Apple Intelligence (it’s not tied to OS updates from what I can tell).

Of course I imagine Apple is not going to be the fastest mover in this regard. I’m not even sure they believe the product will be widely impactful anymore and may keep it relegated to a small list of popular use cases like photo touch ups and quick questions to Siri. For me the most useful parts of Apple’s AI don’t even require me to enable Apple Intelligence.


When I read articles like this my reaction is always “put up or shut up.”

If you have a better idea, make it happen.

The author merely described the parameters of a solution and didn’t even attempt to solve it.

In essence, we aren’t even left certain that a better solution that satisfies all stakeholders is possible.


It will cut off not only icons but also application menus when they have a lot of them. There is no way to fix it except to change your scaling or connect a second monitor.

I should save this thread for every time someone tries to tell me that Windows is a horrible operating system and a major reason not to buy a computer, whenever I say things like "The MacBook Neo isn't that good of a deal, and you can totally find a Windows laptop in the price range that's built well enough, has similar performance/battery life (or better)/trackpad, and leaves you with more RAM, storage, and I/O."

I've literally picked out laptops that are clearly better buys than the Neo/Air and people will tell me things like "well then you're stuck with Windows" or "but you'll have firmware problems" and then we have to remember that Apple has had plenty of that in their past.

How about those Nvidia GPUs that would fail inevitably in older MacBook Pros?

Or the butterfly keyboard?

Or how they can’t even make window corners that match with the Liquid Glass update?


Do you have a suggestion for a Windows laptop that’s a better buy than the MacBook Neo? I kinda want a Windows laptop (for being able to run simple games, mostly) but not sure which one

Walmart is selling an HP gaming laptop with 16GB RAM and a 512GB SSD for $699—same price as the Neo.

Keep in mind it's not Magic Mac Memory because someone will jump in and tell us that 8GB of Mac memory is clearly superior to 16GB of PC memory because Macs are able to swap and wear down your SSD in the process.


I have pointed out this stat before:

Global user base

Mac: 100 million (2024)

PC gamers: 900 million

A lot of Mac enthusiasts seem to scoff at the idea that someone buying a laptop wants it to be able to play some kind of video games. Apple can make the greatest computer in the world but for many customers the fact that it can only run ~5% of games or whatever is a dealbreaker.

The Neo can play many games on some level but having 8GB of RAM plus needing to share it between the CPU and GPU is a major disadvantage.

The lack of a fan also hampers the performance of the chip inside by something like 15-30%; Apple could have included one for a nominal cost to maximize performance.

It’s totally fine for the intended customer but it’s a computer for a very specific customer, more niche/specific than a customer who “just wants to play some CS:Go on the side.”

Apple could swallow their pride and partner with Steam but they’ll never willingly encourage their users to use a different App Store even if it makes the computer better.


The ability to run without a fan is not a problem; it's a feature. Would you want a fan in your phone?

It's fine, but it's a design decision with tradeoffs, and gamers are prepared to make different tradeoffs (bigger and noisier are ok if they deliver a big enough performance jump).

Is that the tradeoff they make with the Nintendo Switch? I’ve never heard the fan in my Nintendo Switch and it’s a very compact device. My Nintendo Switch 2 is also very compact, smaller and lighter than a MacBook Neo, and it can play AAA games at high frame rates (e.g., Resident Evil: Requiem) while the MacBook Neo struggles with 5 year old titles like Cyberpunk.

This is a very comparable device considering it’s also an ARM-based computer essentially.

We need to stop making excuses for Apple’s unwillingness to include a basic form of cooling for their low end devices. It’s just price segmentation. Make the cheap stuff artificially slow, push you up to the MacBook Pro.


If you cannot hear the fan in a Switch, I implore you to get your hearing checked. It’s not a noisy fan, it’s not a problem the fan is there, but it’s not silent!

I don’t believe I ever said it was silent.

Do you not agree that having a fan in the system was a good design trade-off?


What about the fan in the Nintendo Switch? Do Nintendo Switch owners hear the fan or consider it a problem that stops them from making a purchase?

I don’t know why people parrot this talking point about a lack of fan being a positive feature. It’s like a shared propaganda talking point that Mac enthusiasts all agree upon universally. If Apple added fans to the Air and Neo you’d all change your tune since Dear Leader changed their mind, just like when Apple enthusiasts stopped blindly hating Intel suddenly during the architecture transition. You’d all say stuff like “Apple gave us boosted performance and you can’t even hear the fans! All those PC laptops that I’ve never cross shopped since 2001 sound like jet engines!”

A simple passive heatsink has been shown to boost performance significantly in the MacBook Neo.

The throttling of the chips in Apple’s lower end systems is an intentional form of price segmentation. The MacBook Pro wouldn’t be any faster than the Air if the Air were just cooled properly.

I would unironically take a fan in my phone if it stopped it from throttling, dimming the screen, and halting charging when it’s a hot day in direct sunlight. It would just have to make sense in the context of a phone design, of course, which is a challenge.

https://support.apple.com/en-us/118431

There are phones on the market with detachable active cooling solutions to help with sustained intense gameplay:

https://rog.asus.com/phones/rog-phone-9-pro/


Nice of you to decide we’re just parroting instead of thinking.

If the MacBook Air had a fan, it would be thicker and would need a bigger battery. It would then be the same, aside from the screen, as the base MacBook Pro. You are 100% correct. The fact it has no fan allows Apple to reduce its weight and thickness, thus reducing its price. You’re absolutely right.

Fans in laptops are more and more a gamer pilled flight of fancy. Phones and iPads have shown they’re not a necessity.


Removing a fan reduces the price? By how much do you think? Is the Nintendo Switch expensive because of the fan?

Is the Nintendo Switch/Switch 2 a thick device? They are thinner than the MacBook Pro, and they have more space constraints than a MacBook Neo.

If fans in laptops are just for the “gamer pilled,” why does the MacBook Pro have one?

Do you think Apple can continue to grow their marketshare indefinitely if they continually ignore the 900 million PC gamers who currently own Windows PCs? The PC gaming market is the only one that has been growing since 2021.

https://www.techspot.com/news/106371-gaming-industry-hits-wa...

The iPad doesn’t prove anything, it’s routinely criticized for wasting its performance potential with inflexible and limited software. Its performance limits are never tested because you can’t actually do things on it in comparison to a full desktop OS.

My phone will regularly dim the screen, halt charging, and throttle performance when I’m out in a sunny day during the summer. You ever been to Miami? I would actually be interested in an actively cooled phone if it existed and would accept a device that was thicker.


It's also a con. You get worse sustained performance. You also get a hotter device. There's a reason the base model M series MBPs consistently bench higher than the exact same chip MBAs in things like Cinebench. The fan.

As I’ve pointed out in my other comments, the Nintendo Switch and Switch 2 are perfect devices to dispel this whole “no fan is better” narrative.

Clearly it’s not a challenge to make a compact, performant device with a nearly silent fan. Clearly customers don’t mind that devices have fans even for devices meant to be held in hand for hours that weigh less than a pound.

I can buy a handheld from Nintendo for $450 that can play new AAA games with great performance while the Neo struggles with 5 year old titles like Cyberpunk despite likely having better overall hardware. A MacBook Neo with a fan would get 15-30% better overall performance and +50% framerate in games as has been demonstrated by multiple tinkerers on YouTube.


When he said games did you assume he meant Solitaire?

This can depend on what’s on sale in your region. I also have some thoughts about buying at this price range down below.

I’ll shill a website from a YouTuber, called bestlaptop.deals. It tracks sale prices and has reviews attached for the laptops, along with categories for use cases. Shopping for Macs less frequently involves big sales, but with Windows laptops, being patient can pay off.

I’ve seen recent reviews indicating Windows on ARM has made really great strides, including anti-cheat support for many online games. Not every game works, but many do without any effort.

I bring these up because the battery life is excellent and many of them are in the $500-600 range.

https://youtube.com/watch?v=f8EbtQ7jQnQ

Yes, that’s a sponsored video, but I’m linking it to show you his commentary on the software situation.

For x86-based computers there are a couple of ways you can go:

Since you mentioned gaming, you can sacrifice some portability and go with something like a Lenovo LOQ. A previous generation unit will cost about $700 and have an RTX 4050, which is enough to beat anything Apple will sell you before you get up to Pro chips. I believe there are other OEMs that may hit that price point with an RTX 5050 which of course will be an improvement.

These systems do get good battery life when you’re in integrated graphics mode. When you’re gaming you’re going to be plugged in regardless of laptop.

Another one I’ve seen on sale lately has been the Yoga 7 14” with either the Ryzen AI 340/512GB storage or the Ryzen AI 350/1TB of storage. I think the sales aren’t as good as they were a couple of weeks ago. These have a 2K OLED screen, 2-in-1 and pen support, and are generally good overall systems. The 350 model has significantly better integrated graphics performance, so I’d try to stretch for that one.

Finally, in person I was really impressed with the Acer Aspire 14 AI for being only $530. I did wish the screen was a bit better, but the rest of the system was really impressive for that price.

There was an HP OmniBook I played with in store that had a great aluminum build, though the value wasn’t quite as good. It seemed like it was designed to compete with the Air and felt to me like an Air clone in a way.

I haven’t touched on used, which is obviously an option. There are a lot of options there and I think it’s worth looking into.

I would still say, if you can, spend more than what the MacBook Neo costs. The MacBook Neo isn’t a revolutionary device that changes the game in my mind. Instead, it’s a machine that makes a lot of similar sacrifices that other cheap laptops make. It’s better if you save up and spend more if you can.

For example, if you’re interested in gaming, you just missed an amazing sale on the RTX 5070 Ti/32GB RAM version of the Zephyrus G14 at Best Buy. It was $500 off, so about $1800 for a really amazing machine that is basically the best thin-and-light gaming system on the market.

Also keep in mind as I talk about this, I’m biased against 15-16” models. I like 14”.


I’m with odo1242, where’s a $700 Windows laptop that has the Neo qualities? I like my Thinkpad - it’s currently my only Windows machine - but it was $1300 or so for the entry-level model (not going to count my add-ons).

I don’t love Windows, but I don’t hate it either. Amazing backward compatibility, and that is not to be ignored.


Yoga 7, check Best Buy, although I think the discount was bigger a few weeks ago.

It’s actually better in many ways: 2K OLED touch screen, convertible with pen support, double the RAM, backlit keyboard, ranks better on battery life for office tasks, a far wider array of ports.

If you stretch to the 1TB model you get the Ryzen 7 AI 350 which beats the Neo on integrated graphics and multi-core processor speeds. You’ll pay a little more but if you need the storage the Neo is out of contention already, and at that price your MacBook Air will come with 256GB.


That’s strange, there aren’t wider market supply chain issues outside of DRAM. Maybe your vendor is just throwing excuses around.

>That’s strange, there aren’t wider market supply chain issues outside of DRAM.

GPUs, ram, ssds, hdds, hell even CPUs are starting to climb in price. It's an everything shortage and it's only getting worse.

A workstation that two years ago cost $3,000 was $10,000 last month and $10,500 this month. There are parts which aren't available at any price.


Wait what? That's over 300%.

Between this revelation and that post recently on HN about the scanned receipts and egg prices, I find myself wondering if we're worrying about the wrong things.

We're seeing massive inflation in computing, but because the dollar is holding its value we call it increased prices. The buying by the big buyers is what's driving the inflation; its mechanism is scarcity.

But it's also localized. Only we experience this as a problem, because compared to the hyperscalers we're poor.

The same idea applies to the price of groceries. As prices increase, the base increase is inflation, but logistics efficiency also plays a big role.

The effect is the same. The ones with more spendable income don't experience an issue yet, while in the projects nobody is eating fresh veggies.

The part that scares me is the creep, as I call it. Throughout the years I've always been able to carry price shocks and such but this time I'm out of the game. No more DRAM for me.

I then wonder if one day, without losing my job, I won't be able to pay for veggies.


At least with veggies you can stick seeds in the ground in the backyard.

My hard drive tree will take years to develop before it bears fruit!


DRAM and flash both seem to be up about 10x. HDDs are just impossible to buy.

Fuel price rises = logistics price rises.

You're right that fuel prices have risen. But usually the impact of fuel prices is mostly felt on bulkier, lower cost items first.

After all, a truck can carry a 10kg sack of rice, or a 10kg nvidia gpu. If shipping costs for 10kg rise by $15 the sack of rice has doubled in price, but the GPU is only 0.5% more expensive.
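The arithmetic above can be sketched quickly (the $15 shipping increase and the base prices are the assumptions from the example, not real freight data):

```python
def price_increase_pct(base_price, shipping_increase=15.0):
    """Percentage price increase caused by an added per-package shipping cost."""
    return shipping_increase / base_price * 100

# A 10 kg sack of rice at ~$15 vs. a 10 kg boxed GPU at ~$3,000:
print(f"rice: +{price_increase_pct(15.0):.1f}%")    # prints rice: +100.0%
print(f"gpu:  +{price_increase_pct(3000.0):.1f}%")  # prints gpu:  +0.5%
```

The fixed cost dominates the cheap, bulky good and barely registers on the expensive one.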


Not all increases in prices are reactive. Some are anticipated. Inertial inflation is real.

For a truck yeah, but across the ocean, it isn't quite that simple because GPUs and grains are sent in different types of ships (or different modes entirely) that aren't interchangeable.

You're right - perishable goods have to be shipped fast. Your bananas, berries, fresh fish, and not-from-concentrate juice can't be on some slow-steaming container ship with the furniture, clothes, building materials and vehicles.

The GPUs can though.


Rice is a nonperishable grain, and grain ships in neither of those: it's shipped in bulk carriers.

And the GPUs are such high margin that they all take an airplane anyway.

That is other “different mode entirely” that exists to go across an ocean :)

This is driven by AI datacenter demand, not fuel prices. RAM prices have actually dropped significantly in the last couple days as the Iran war hit and the possibility that interest rates might go up and pop the AI bubble sunk in. (Though let’s see where they go after the last couple days of whipsawing.)

It's driven by a whole bunch of factors but I agree it's largely driven by AI data center demand

But still 30% of the worlds helium production is apparently shut down and ships can't get to where they need to be as efficiently as they have been so there is going to be knock on effects from this.


Yeah. Not true. Or send me the name of your server vendor. I’m buying.

Having issues with both price and availability on NVMe and SATA flash, starting to see some CPUs, and, for a personal project, high-density spinning rust (24TB+).


DRAM is up more than that 50%, though.

Flash has supply (and price) problems too.

This isn't true: NAND flash prices are up too, though not nearly as dramatically. But the war means that fuel and shipping prices are way up as well.

They’re throwing something around.

I assume this is sarcasm.

SSDs and HDDs are being squeezed as well.

Don't forget SDCards

"Memory card prices have TRIPLED in the last few months: when will this madness stop?!" https://www.digitalcameraworld.com/cameras/memory-cards/memo...


Sony stopped making their cards entirely, which stinks because I'd settled on their pro cards for all my camera bodies.

We just had a vendor tell us none of the HDDs we were looking for were available unless we also committed to a full NAS offering.

“Killing” is strong phrasing.

Yes, a $250 mini PC I bought last year is now $350.

Is this pricing bad? Yeah, compared to what it was.

Is this the end of the world? Not really, and we’ve seen price spikes for all kinds of PC components in the past. It’s rarely permanent.


That sounds pretty nice. The same mini PC I paid $195 for in 2023 is now $450. Seems to be life in Canada sometimes.

It had caused me to look around though. I have found the Pi Zero 2W to be surprisingly capable for Pi sized jobs.


Not everyone earns tech bro salaries and can sustain a thousand cuts. Many hobbyists are scraping and saving money to acquire hardware. For some it very well may be the end of their world.

We are talking about brand new latest gen hardware here. People with low budgets are always scraping and saving for deals and don’t need to buy something brand new from a pricey brand name like raspberry pi.

You can still jump on eBay and buy all kinds of dirt cheap used pieces of hardware.

My buddy just bought a used ThinkPad T14 with 32GB of RAM and 1TB of storage for about $500. You can get by with a whole lot less.

In this context, I will also present the idea that the Raspberry Pi has represented quite poor value for the money for many years now.


Have you looked at how expensive international shipping is? eBay covers just a few countries, the rest of us can't buy there because we'll be paying 10 times the cost of hardware to get it over here.

I already moaned about this recently, but to briefly reiterate: the only hardware that's becoming available for most people in my region are Frankenstein desktops built from heavily used 10+ year old Xeons running on suspicious motherboards made by obscure Chinese manufacturers you've never heard of. This is pushing ever more people towards smartphones and away from actual computers.

But at least we got the bullshit machine in return, that's something, I guess.


> Have you looked at how expensive international shipping is?

It really shocks me how bad shipping has gotten. It's nearly unaffordable to buy things on eBay from the US as a Canadian due to shipping costs, so I can only imagine just how bad it is for people from other countries.


It's probably unaffordable for anyone to buy things from the US due to shipping costs, because the Trump administration has completely screwed up everything there with tariffs and mismanagement of the USPS and more. But the US is not the world. A better comparison is how much it cost to ship things from China a year ago compared to today.

> Frankenstein desktops built from heavily used 10+ year old Xeons running on suspicious motherboards made by obscure Chinese manufacturers you've never heard of.

I've heard reports that these are actually surprisingly good. I wouldn't want to use one in a production environment, but for homelab stuff they're an incredible deal.


That cheap stuff from eBay that people talk about all the time seems to be available only in North America, or in the best case Western Europe.

ThinkPad T14 which generation?


Yes, 90%+ of sellers refuse to ship here (and we're not even under any sanctions and/or political pressure of any sort). I hear about these magical 100$ Thinkpads all the time; I'm yet to see anything cheaper than 300$ (add another 100$+ for shipping).

Sometimes goes the other way. I was recently looking for a specific PC case (Fractal Design Torrent Compact without a window) and it's entirely unavailable in North America.

Placed an order with a Polish seller on eBay, received a message that Fedex wouldn't take the package due to size, replied that they could send with any shipping company and that I'm not concerned with shipping speed, after which they cancelled the order on me.


I think the post is referring to SBCs, whose viability mini PCs threaten as well.

Something I’ve been thinking about, somewhat related but also tangential to this topic:

The more code gets generated by AI, won’t that mean taking source code from a company becomes legal? Isn’t it true that works created with generative AI can’t be copyrighted?

I wonder if large companies have thought of this risk. Once a company’s product source code reaches a certain percentage of AI generation, it no longer has copyright. Any employee with access can just take it and sell it to someone else, legally, right?


In theory, companies are all going to have an increasingly difficult time suing competitors for copyright infringement. By extension, this is also why, IMO, it's important to keep AI generated code out of open source/free software projects.

The recent rulings on copyright though also need to be further tested, different judges may have different ideas on what "significant human contribution" looks like. The only thing we know for certain is that the prompt doesn't count.

My guess is that instead of enforcing via copyright, companies will use contracts & trade secret laws. Source code and algorithms count as trade secrets, so in your example copyright doesn't even matter; the employee would be liable for stealing trade secrets.

AI generated code slowly stripping the ability of a project to enforce copyright protections though is a much bigger risk for free software.


I wonder if an argument could be made that because the LLM came up with the implementation that it’s not a trade secret?

Of course, intent is a very important legal concept here. I doubt anyone is getting away with what I described.

It’s just interesting stuff to potentially rethink.


Given trade secrets can't be enforced once they are made public and contracts don't bind anyone who hasn't signed them, it's not a great substitute for copyright.

My guess is companies will simply pretend like generated code is copyrighted, file fraudulent DMCA notices if leaks happen, and hope no one decides to challenge them in court.


It becomes less interesting the more the “overweight” stocks correct.

The extreme concentration risk lessens as these 8 stocks fall in value compared to the rest.

I also don’t personally see the risk in the concentration. Risk of what? These companies are legitimately larger and doing more business than other firms.

Pick a median consumer. Which company are they sending more profit to than companies like Apple or Amazon?

10 years ago the average consumer maybe bought an iPhone from Apple every 3 years, so they gave Apple less than $100 of pure profit dollars per year.

Now that same consumer is giving Apple money for the iPhone, but also spending on services that they weren’t buying 10 years ago. If they’ve got an Apple One subscription they’re now sending Apple double or triple the profit they used to get.

These companies are big because they sell more things and are more diversified than they were in the past.

There’s no concentration risk. I’d actually argue that the concentration risk can be resolved overnight through antitrust regulation (e.g., force Apple and Amazon to split into multiple companies, as they already have obvious verticals that could stand alone).


The concentration risk relates to diversification in investing. Index funds are generally thought of as a way to diversify a portfolio. Cap-weighted index funds are generally preferred because they are cheaper for the provider to maintain. Compare VOO with RSP, for example. VOO is cap weighted. RSP is equal weighted, which means investors in RSP bear the cost of periodically readjusting all holdings so they are once again equally weighted, something not necessary with VOO.

I am not the only investor who has taken steps to offset the overly high concentration in the S&P 500 that raises the riskiness of an investment portfolio. I've done so by splitting my VOO holdings in half, into a 50/50 VOO/VTV split that strategically diminishes the impact of the top 10 stocks in the S&P 500.
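To make the dilution effect of a blend like that concrete, here's a rough sketch. The top-10 concentration figures are made up for illustration, not real fund data:

```python
# Hypothetical concentration figures, purely for illustration:
# assume the top 10 stocks are ~35% of a cap-weighted S&P 500 fund
# and ~20% of a value-tilted fund.
def blended_top10_exposure(w_cap, w_value, top10_cap=0.35, top10_value=0.20):
    """Top-10 exposure of a two-fund blend, as a fraction of the portfolio."""
    return w_cap * top10_cap + w_value * top10_value

# A 50/50 split dilutes top-10 exposure from 35% to 27.5% under these assumptions.
print(f"{blended_top10_exposure(0.5, 0.5) * 100:.1f}%")  # prints 27.5%
```

The blend's top-10 exposure is just the weighted average of the two funds' exposures, so the split directly controls how much concentration you keep.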


I certainly think it's a good thing to diversify investing, while recognizing that there is value in putting a lot of your bets into heavyweights that are very likely to do very well in the long term.

One of my main points here is that dumping a lot of money into one company isn't always something that represents lack of diversity in your investment dollars.

A company like Microsoft has its hands in so many business verticals that its stock by itself is a highly diverse asset.

I also think it's important to realize that massive companies like these have inherent advantages over smaller ones. A company like Framework literally cannot make a better laptop than Apple even if an angel investor dropped billions of dollars into their laps. Even if they pulled it off, it wouldn't come with a free trial for Apple's content subscriptions and other revenue-maximizing features, and the wholesale price they get from the factory can't match Apple's margins on the device until they convince a large enough mass of people to buy them.

That's the kind of stuff that big companies can do, and that's why they are worth putting more bets into than smaller ones.

Obviously, companies like Tesla and Nvidia are far bigger risks in the S&P 500, but they represent a small minority of those giants.


There is nothing wrong with your desire to 'dump[ing] a lot of money into one company'. That is easy to do without an index fund. And it is not the investing theory behind the creation of index funds and their investing purpose. When 8 companies dominate an index fund, that means the index is not performing the intended function for which it was created.

But the index fund is doing what it was designed for, which is to index on the companies based on their relative importance in the marketplace.

And that’s really my whole point. Someone who is buying an S&P Index fund wants to own more Apple than GoDaddy, because Apple represents much more economic activity than GoDaddy.


I have read John Bogle extensively. I believe he would disagree with you about the purpose behind why Bogle invented the index fund. Index funds are cap based primarily because that saves on costs (there is no need to rebalance the index). But the philosophical framework is diversification. When 10 companies make the other 490 irrelevant in producing the annual return of the index, the index itself is no longer serving the diversification purpose.

Nobody is going to deny enjoying the monetary gains produced by the index becoming concentrated. But it comes at the cost of the portfolio risk that diversification (i.e. absence of concentration) is intended to eliminate.


I totally get what you’re saying.

I’ll make an analogy to maybe help explain what I mean further:

I own a somewhat diverse set of 50 company stocks, at least for the purposes of this exercise.

Let’s say a bunch of those companies merge, now there’s only 20 companies.

No product lines have been discontinued. The companies make all the same things with the same client lists.

Did my investments become less diverse when these companies merged? Perhaps in some ways yes, in many other ways no.

Is my investment portfolio more diverse if I own one stock, Apple, or if I own three stocks, Time Warner, Paramount, and Comcast? All these companies make media content, but Apple is in more industry verticals overall in addition to being a media company (or at least, we can say they are for the purposes of this analogy). If the content industry collapses, Apple is fine, the rest not so much.


Size and success are not diversification factors. Investment history is scattered with the bones of 'golden child' companies that never saw the death train coming at them through the tunnel. Intel. Nokia. Blockbuster. Yahoo.

Moreover, your examples are crossing over into active investing versus indexing. Indexing theory submits active investors cannot beat indexing over time (Buffet's purchasing/controlling whole companies notwithstanding).


I'm not talking about size and success, I'm talking about participation in a diverse array of industry verticals.

My example is not meant to specifically talk about active investing, I'm just picking out companies to discuss within a hypothetical index holding.

> Intel. Nokia. Blockbuster. Yahoo.

Interesting, 3/4 of these still exist and are doing reasonably well. If you bought their stocks 30 years ago you'd be up on your investment on all of them except for Blockbuster. Obviously, they're not top performers in that timespan (although Nokia ADR pays dividends like other telecoms so maybe it is a good investment in the right index).

You have inadvertently demonstrated some of my point here: companies that serve diverse verticals stick around for decades. For example, Nokia’s consumer business evaporated but their telecom business is still here. See also: BlackBerry.


I wonder what a solution could look like. Perhaps keep the market cap weighting, but cap the weighting at a max $500b (or some sliding scale to prevent the top X stocks from composing more than Y% of the portfolio)

That would certainly be a way to control escalating concentration, but at the expense of keeping index fund costs low. The Vanguard Total Stock Market Index (VTI) has an expense ratio of 0.03%, almost zero. Low expenses are a critical factor behind why index funds outperform active investing. So, yes, your proposal would work, but the expense ratio would go up to implement the cap.
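The capping idea above can be sketched as a small redistribution loop. This is a hypothetical illustration with made-up weights, not how any real capped index methodology works:

```python
def cap_weights(weights, cap=0.05, tol=1e-12):
    """Cap each index weight at `cap`, redistributing the excess
    proportionally among the uncapped names until no weight exceeds the cap.
    Assumes weights sum to 1 and cap * len(weights) >= 1, so a solution exists."""
    w = dict(weights)
    while True:
        over = {k: v for k, v in w.items() if v > cap + tol}
        if not over:
            return w
        excess = sum(v - cap for v in over.values())
        for k in over:
            w[k] = cap
        under = {k: v for k, v in w.items() if v < cap - tol}
        total_under = sum(under.values())
        # Redistributing may push a name over the cap;
        # the next loop iteration corrects that.
        for k in under:
            w[k] += excess * under[k] / total_under

# Hypothetical 4-stock index, 25% cap: the two oversized names get trimmed
# and the excess flows to the smaller ones.
capped = cap_weights({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}, cap=0.25)
```

As the parent comment notes, the cost objection is exactly this loop: unlike a pure cap-weighted fund, the capped portfolio has to be periodically rebalanced as market caps drift past the cap.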

Sounds like some lame ass tech founder bullshit if I’ll be honest.

If I had cancer, the last thing I'd be thinking about would be making a slide deck about it.

Can these robot people come back down to earth and have a genuine human experience for a change? Not everything has to be framed in the view of a startup company or a data analysis exercise.

Maybe focus on spending time with your family and friends? If they still like you after years of being an insufferable tech bro.


He’s been public that he’s ten months clear now. Some prefer to accept undesirable circumstances. Others prefer to oppose them. He’s one of the latter. A little paraphrase of Dylan Thomas’ work here is something I’m fond of:

Do not go gentle into that good night

Rage, rage against the dying of the light

And if he’s successful, which hopefully he now has a much better chance of being, there are all these new medical results out there that are useful.

As an example, a close friend is using one of the personalized medicine companies that sytse’s “CEO of care” has invested in to diagnose a persistent debilitating condition with no specific cause.

Or to quote someone else: All progress depends on the unreasonable man.


That’s so cool.

If you can cure a cancer by framing it as a data analysis exercise that doesn't seem to be a bad thing.

Who hurt you?

Tech bros who are taking my job with AI and trying to fuck up the world for profit?

> Serviceable, repairable, upgradable Macs are officially a thing of the past.

Well, not exactly. Apple’s desktop Macs actually all have modular SSD storage, and third parties sell upgrade kits. And it’s not like Thunderbolt is a slouch as far as expandability.

I can see why the Mac Pro is gone. Yeah, it has PCIe slots…that I don’t really think anyone is using. It’s not like you can drop an RTX 5090 in there.

The latest Mac Pro didn’t have upgradable memory so it wasn’t much different than a Mac Studio with a bunch of empty space inside.

The Mac Studio is very obviously a better buy for someone looking for a system like that. It’s just hard to imagine who the Mac Pro is for at its pricing and size.

I think what happened is that the Studio totally cannibalized Mac Pro sales.


Thunderbolt absolutely is a slouch.

Every PCIe card I have requires its own $150+ PCIe-to-Thunderbolt dock and its own picoPSU plus 12V power supply.

External PCIe is convenient for portables. Not for desktops. It's a piss-poor replacement for a proper PCIe slot.


Why don’t you just get a multi-slot PCIe box?

I could.

It would be even cooler if that box was also housing my computer and powered by the same power supply.

And then the PCIe lanes could just run to the CPU/SoC instead of having to be wrapped in Thunderbolt.


And here I am with a gaming PC that has absolutely nothing in the PCIe card slots except for a graphics card. 1 out of 5 slots filled!

The truth of the matter is that there's basically no hardware on the market that actually depends on PCIe bandwidth besides graphics cards.

Furthermore, us computer nerds don't like to hear it, but making someone open up a computer is a barrier for many customers.

E.g., say I'm selling a computer to a video professional who wants a processing appliance that connects to their computer to help with their video workflow.

Is that video professional also skilled in computer subjects like opening up computers? Probably not! A nice box that you plug in to Thunderbolt is way simpler.

Bonus points, you can plug that box into any other computer without taking your whole computer apart.


Apparently the Neo is surprisingly repairable - in that parts can be replaced, not that you can buy stuff at Microcenter or Fry's (RIP) and shove them in.

It's sad that "you can replace the SSD" is in some people's eyes "serviceable, repairable, and upgradeable".

We should demand better of our computer-manufacturing overlords.

> It’s not like you can drop an RTX 5090 in there.

Why not? Oh, right, because Apple won't let you. Sad.


I didn’t phrase myself very well. What I’m saying is that the loss of the Mac Pro didn’t reduce the repairability or modularity at all in the product lineup.

It was exactly as modular as the Mac mini and Mac Studio.

The only difference is that it had some PCIe slots with basically no use, since you couldn’t throw a GPU in there and Thunderbolt 5 exists.

Yeah, sure, there were some niche PCIe things that two people probably used. Hence the discontinuation.

I am an ex-Mac user, I own a Framework. Don’t worry, you’re preaching to the choir.


> Apple’s desktop Macs actually all have modular SSD storage

"Modular" does not mean that it's serviceable, repairable or upgradable. Apple's refusal to adopt basic M.2 spec is a pretty glaring example of that.


> Apple's refusal to adopt basic M.2 spec

I get the ideological angle, but in practical terms that's not a barrier: https://www.aliexpress.us/w/wholesale-apple-ssd-adapter.html...


Those are all for Intel Macs, and not even the recent Intel Macs. You can't use a passive adapter to put a NVMe SSD into a current Mac like you could a decade ago, because back then the only thing non-standard about the SSD was the connector. Now most of the SSD controller itself has moved to the SoC and trying to put an off the shelf SSD into the current slot makes no more sense than trying to put an SSD into a DIMM slot.

This is the USB-C dongle argument all over again, but with a proprietary connector that a total of one (1) company uses.

Honestly I don't care, but Apple's SSDs don't have a storage controller on them, and those adapters are designed to "bypass" the controller on M.2 drives.

You can argue that it's different for the sake of being different, but

A) I personally don't hold that a monopoly standard is always a good thing; even if we agree M.2 is fairly decent, that doesn't make it universally the best.

B) I'd make the argument that Apple is competing very well on performance and reliability.

C) IIRC there are some hardware guarantees that the new filesystem needs to be aware of (for wear levelling and error correction), and those would be obfuscated by a controller that thinks it's smarter than the CPU and OS.

If we're talking about Intel-era Macs, then that proprietary connector predates M.2 entirely and is actually even thinner and smaller (which is pretty important when the primary use case is thin-and-lights); though I suppose the fact that the adapter fits is a sign that it would have been possible to use a larger connector...


That is an absolutely awful argument against what I just said. I can tell that you don't care.

Tens of thousands of mini PC and laptop boards ship with multiple M.2 slots. Apple can use both connectors, with the exact same caveats that normal M.2 SSDs have on ordinary filesystems. Apple does not have to enable swap, zram, or other high-wear settings on macOS if they are uncomfortable with the inconsistency of M.2 drives. Now, I'd make the argument that people don't complain about APFS wear on external SSDs, but maybe I'm wrong and macOS does have some fancy bypass saving thousands of TBW/year.

Whatever the case is, "the annoying thing is competitive" was not a justification for the Lightning cable when it reached the gallows. It did not compete, it specifically protected Apple from the competitive pressure of higher-capacity connectors. The same is true of Apple's SSD racket and the decade-old meme of $400 1tb NVMe drives.


I don't buy that argument. "A PC by any other name" is what made Intel Macs somewhat uncompetitive when compared to the M-series laptops, which are currently dominating with total vertical integration of the OS and hardware.

Also: all things being equal, the Lightning connector was technically superior to USB-C and arrived much earlier, so it's somewhat on the same path.

USB-C succeeded due to a confluence of:

A) Being a standard people could get behind (Lightning was, of course, much more awkwardly licensed).

B) Lightning never got a sufficient uplift beyond USB 2.0 performance.

C) The EU eventually killed Lightning through regulation.

It was, however, smaller, more durable, and (as mentioned) earlier.

I'm totally not against our new USB-C-everywhere situation w.r.t. phones, but if anything it reinforces the point: the technically superior thing being too proprietary caused its death (despite being early).


Even without an adapter there are 1st and 3rd party modules available.
