All spreadsheets used to run in your terminal, in the old days. You can still download one here and I wouldn't be surprised if it still works: https://winworldpc.com/product/quattro-pro/4x
Go to https://copilot.com and ask a question. You can see from the answer that it is clearly for entertainment only. Three years ago Microsoft was considered a leader for having the foresight to invest in OpenAI. Today they are a laggard.
On the contrary, the first companies to acknowledge the new AI Winter will have a big competitive advantage over those still dumping money into the bottomless pit.
I've seen the "success" a non-software company has been having, trying to integrate AI into their processes. A hypothetical competitor who chose not to do so would absolutely be coming out ahead right now.
I can't say whether this trend will continue, but the answer to your question today is "yes".
Genuinely curious, what could an LLM even do for an ice cream shop? Checkout already takes less time than scooping a cone, and it's even quicker with cash. Maybe it could surveil the customers and employees? But I think that will lose you more customers than it gains.
Generally I would expect the ROI to be negative, like we've seen with most corporate AI projects, so yeah any ice cream shop that didn't invest in "AI" is going to come out ahead of one that poured money into the pit.
People who still believe AI is a temporary thing are the same ones who refused to get a mobile phone in the late '90s, insisting it wasn't needed and was just a nice-to-have.
Maybe that's because, for people who've never heard of Azure, the name would just blend into the wide spectrum of cloud products, whereas "Microsoft" is a name almost everyone recognizes.
That's because TFS/VSTS followed the same naming convention where the "S" stood for either Server or Services. Once they rebranded the Azure-backed hosted version Azure DevOps Services, then it no longer really made sense to do anything but rename the self hosted version in the same fashion.
It would have been more confusing to have Visual Studio Team Server and Azure DevOps Services being the same product but hosted differently.
Not just developer tools, reusing trademarks in general.
At one point the next version of Windows Server 2003 was going to be Windows .NET Server.
Also Windows CE, Outlook Express, Xbox App, Xbox Game Pass for PC, Visual Studio Code, Visual Studio for Mac, Microsoft Office Excel, Microsoft Office Word, etc.
Howard R. Moskowitz is an American market researcher and psychophysicist. He is known for his detailed study of spaghetti sauce varieties and for horizontal market segmentation. By providing a large number of options for consumers, Moskowitz pioneered the idea of intermarket variability as applied to the food industry.
Well, it depends on what you're talking about. The models were originally called LaMDA, followed by PaLM, and then finally Gemini. The chatbot product was internally known as Meena, launched as Bard, and then transitioned to Gemini once the Gemini model came out.
SAP sales reps used HANA to mean "cloud" in the beginning... which was BS back then and still is today. But while everybody wanted to be in the cloud, SAP sales was scared of not being with the cool kids if they didn't somehow add to the cloud talk.
I don’t think we’ll see Apple actually rename all their apps over it. It’s simply a feature, it doesn’t change what the app is.
Also, Apple tends to make system services that are implemented once and work across all apps in the OS, like with their writing tools. The app didn’t change, it can just take advantage of a new system-level feature… and so can 3rd-party apps.
A product doesn’t have to have every feature baked into the name.
They could simply have marketing that talked about “<product name>, now with Copilot”. Eventually the marketing moves on to the next thing, Microsoft products have already become synonymous with Copilot/AI due to the marketing and general use, and the names stay clean and consistent over time.
I think this is the right answer. I am frustrated by Copilot and by many aspects of AI, but to me it seems like straightforward branding: you use a Microsoft product, you want to use AI in it, you look for Copilot (name and/or icon).
To me, the issue isn't that they've named so many things 'Copilot' but rather that Copilot is in every goddamn product.
Not if AI is ultimately a commodity, which it likely is. We don't want or need branded terms for other common features, like networking or files. In the early days of networking, before it was standardized, there were attempts to brand proprietary protocols like NetBIOS and IPX. I don't want to repeat all of that every time some company wants to establish vendor lock-in or branding.
Keep in mind that employers have to pay $100,000 in visa fees (in addition to competitive salaries) for each H-1B visa. Clearly these immigrants are not undercutting US workers. It is $100K cheaper to hire a US worker.
And ultimately they have a lot more important things to be doing than learning a different email client from the one they use at their desk on Earth. This is an email client on a laptop, not a navigation system.
The mission of the astronauts on board is to test the damn Orion spacecraft in preparation for a human landing on the moon.
> NASA flight controller and instructor Robert Frost explained the reasoning plainly in a post on Quora (via Forbes). “A Windows laptop is used for the same reasons a majority of people that use computers use Windows. It is a system that people are already familiar with. Why make them learn a new operating system,” he reportedly wrote.
Maybe he should have designed the rest of the controls to look like the cockpit of a 2003 Toyota Camry. It is a system that people are already familiar with. And actually reliable.
That’s awesome. I’m assuming there’s zero chance it actually gets deployed (for a value of zero that is less than the chance a moon base is actually deployed, also assumed zero) but if it does, apparently the controls will look like this- https://sj.jst.go.jp/stories/2024/s0124-01p.html
> Do you think the US has idle capacity that can be activated at a moment's notice?
I'm sure some very smart MBA increased profits by eliminating spare capacity or making cuts that would make it much harder to spin up. That's American business culture: focus on this quarter or this year, nothing else matters.
STRICT has severe limitations; for example, it has no date data type (STRICT columns must be declared INT, INTEGER, REAL, TEXT, BLOB, or ANY).
Why is it a problem that it allows data that does not match the column type? SQLite is intended for embedded databases, where only your application reads and writes from the tables. In this scenario, as long as you write data that matches the column's data type, data in the table does match the column type.
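To make that concrete, here's a minimal sketch using Python's built-in `sqlite3` module (the table and column names are invented for illustration) showing flexible typing on an ordinary table, and the STRICT opt-out rejecting the same insert:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Ordinary table: the declared type is only an "affinity", not a constraint.
con.execute("CREATE TABLE t (n INTEGER)")
con.execute("INSERT INTO t VALUES ('hello')")  # accepted despite INTEGER
print(con.execute("SELECT n, typeof(n) FROM t").fetchone())  # ('hello', 'text')

# STRICT table (requires SQLite >= 3.37): the same insert is rejected.
if sqlite3.sqlite_version_info >= (3, 37, 0):
    con.execute("CREATE TABLE s (n INTEGER) STRICT")
    try:
        con.execute("INSERT INTO s VALUES ('hello')")
    except sqlite3.IntegrityError as err:
        print("rejected:", err)
```

Note that a numeric-looking string like `'123'` would be coerced to the integer 123 by the INTEGER affinity; only values that can't be converted are stored with their original type.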
> “Developers should program it right” is less effective than a system that ensures it must be done right.
You're right, of course. But this must be balanced against the fact that applications evolve, and often need to change the type of data they store. How would you manage that if this is an iOS app? If SQLite didn't allow you to store a different type of value than the column type, you would have to create a new table and migrate the data to it, or create a new column and abandon the old one. Your app updates would not feel smooth to users. So it is a tradeoff. The choice SQLite made is pragmatic, even if it makes some of us who are used to the guarantees offered by traditional RDBMSs queasy.
> Why is it a problem that it allows data that does not match the column type? SQLite is intended for embedded databases
I'm afraid people forget that SQLite is (or was?) designed to be a superior `open()` replacement.
It's great that modern SQLite has all these nice features, but if Dr. Hipp was reading this thread, I would assume he would be having very mixed feelings about the ways people mention using SQLite here.
No, I think that people can use SQLite anyway they want. I'm glad people find it useful.
I do remain perplexed, though, about how people continue to think that rigid typing helps reliability in a scripting language (like SQL or JSON) where all values are subclasses of a single superclass. I have never seen that in my own practice. I don't know of any objective research that supports the idea that rigid typing is helpful in that context. Maybe I missed something...
> where all values are subclasses of a single superclass
I don't understand this. By values do you mean a row (in database terms)? I don't understand what that has to do with rigid typing.
Lack of rigid typing has two issues, in my opinion: First, when two or more applications have to read data from a single database, lack of an agreed-upon-and-enforced schema is a limitation. Second, when you use generic tools to process data, the tools have no idea what type of data to expect in a column, if they can't rely on the table schema.
First off, I am so glad the famous "HN conjure" actually worked! My "if Dr. Hipp was reading this thread" was tongue in cheek because on HN it was extremely likely that's precisely what would happen. Thank you for chiming in, Dr. Hipp - this is why I love HN!
So, in case you missed it, you're responding to Dr. Hipp himself :)
> I don't understand what that has to do with rigid typing.
Now I would like to learn a bit from Dr. Hipp himself, so here's my take on it:
Scripting languages (like my fav, Python) have duck or dynamic typing (a variation of what I believe Dr. Hipp, you specifically call manifest typing). Dr. Hipp's take is that the datatype of a value is associated with the value itself, not with the container that holds it (the "column"). (I must say I chose the word "container" here to jive with Dr. Hipp's manifest. Curious whether he chose that word for typing for the same reason! )
- In Python, everything is fundamentally a `PyObject`.
- In SQLite, every piece of data is (or was?) stored internally as a `sqlite3_value` struct.
As a result, a stack that uses Python and SQLite is extremely dynamic and, if implemented correctly, is agnostic about strict types - it doesn't actually care. The only time it blows up is if the consumer has a bug and fails to account for a value's type.
Hence, because this possibility exists, and since no objective research has proven that strict typing improves reliability in scripting environments, it's entirely possible our love for strict types is just mental gymnastics, and the same problems could have been addressed, equally well, without strict typing.
I could reattempt the "HN conjure" on Wes McKinney and ask whether a similar reason led him to compromise between NumPy's static typing and the dynamic typing of the Pandas 1.x DataFrame: as both of them would likely say, real datasets of significant size rarely have all-"valid" data. That design is precisely what lets Pandas handle invalid and missing fields (even if it affects performance).
A good dynamic design should work with both ("valid" and "invalid") present. For example: layer additional "views" on top of the "real life" database that enforce your business rules while you still get to keep all the real world, messy data.
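A rough sketch of that layering idea with Python's `sqlite3` (the `readings` table, the `clean_readings` view, and the "value must be numeric" rule are all invented for illustration): the base table keeps the messy real-world rows, and a view enforces the business rule on top.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (sensor TEXT, value)")  # 'value' left untyped
con.executemany("INSERT INTO readings VALUES (?, ?)",
                [("a", 1.5), ("a", "n/a"), ("b", 2.25), ("b", None)])

# A view enforcing the rule "value must be numeric", while the messy
# real-world rows stay untouched in the base table.
con.execute("""
    CREATE VIEW clean_readings AS
    SELECT sensor, value FROM readings
    WHERE typeof(value) IN ('integer', 'real')
""")

print(con.execute("SELECT COUNT(*) FROM readings").fetchone()[0])        # 4
print(con.execute("SELECT COUNT(*) FROM clean_readings").fetchone()[0])  # 2
```

Consumers that need the guarantee query the view; anything that needs the raw data (auditing, debugging, reprocessing) still has it.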
OTOH, if you don't like that design and absolutely need strict types, use Rust/C++/PostgreSQL/Arrow, etc. They are built from the ground up on strict types.
With this in mind, if you still want to delve into the "Lack of rigid typing has two issues" portion, I am very happy to engage (and hope Dr. Hipp addresses it so I learn and improve!)
The real world is noisy, has surprises in store for us and as much as engineers like us would like to say we understand it, we don't! So instead of being so cocksure about things, we should instead be humble, acknowledge our ignorance and build resilient, well engineered software.
Again, Dr. Hipp, Thank you for chiming in and I would be much obliged to learn more from you.
Thank you for the great explanation. But SQL isn't as dynamically typed as you suggest. If a column is defined as DECIMAL(8, 2), it would be surprising for some values in that column to be strings. RDBMSs are expected to provide data integrity guarantees, and one of those guarantees is that only values matching the declared column type can be stored.
Relaxing that guarantee has benefits. For example, it can make application evolution easier--being able to store strings in a column originally intended for numbers is convenient. But that convenience can become a liability when multiple applications read from and write to the same database. In those cases, you want applications to adhere to a shared schema contract, and the RDBMS is typically expected to enforce that contract.
It also creates problems for generic tools such as reporting systems, which rely on stable data types--for example, to determine whether a column can be aggregated or how it should be formatted for display.
When your application's design changes, you may need to store a slightly different type of data. Relational databases traditionally require explicit schema changes for this, whereas NoSQL databases allow more flexible, schema-less data. SQLite sits somewhere in between: it remains a relational database, but its dynamic typing allows you to store different types of values in a column without immediately migrating data to a new table.
This flexibility is convenient when only one application reads and writes to the table. But if multiple applications access the same tables, the lack of a strictly enforced schema becomes a liability. The same is true when using generic tools to process data in SQLite tables, because such tools don't know what type of data to expect. The column type may be X but the actual data may be of type Y.
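To illustrate the generic-tools problem with a small Python `sqlite3` sketch (the `sales` table and its values are hypothetical): SQLite's SUM() coerces non-numeric text to 0 rather than failing, so a reporting tool that trusts the declared INTEGER type gets a quietly wrong total.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?)", [(1,), (2,), ("oops",)])

# typeof() reveals the mixed storage classes a generic tool would face.
print([r[0] for r in con.execute("SELECT typeof(amount) FROM sales")])
# ['integer', 'integer', 'text']

# SUM() coerces the non-numeric text to 0, producing a quietly wrong total.
print(con.execute("SELECT SUM(amount) FROM sales").fetchone()[0])  # 3.0
```

A tool that doesn't defensively inspect typeof() per row has no way to know the column type Y lurking behind the declared type X.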