I have tried Julia a few times, and each time this is what stops me. I don't understand why they decided to make indexes R-like as opposed to "all other languages"-like. My brain just stops working when I need to recalculate indexes, and each time I'm wrong.
Some people may not realize it, but when it comes to programming languages, ergonomics matter—a lot.
That's just you being unable to let go of your biases? In languages with pointers, giving primacy to offsets makes sense. In every other context, aligning with natural language and math (first, second, 1..n inclusive) is perfectly logical.
I don’t think that any Julia program I’ve ever written would need to change if Julia adopted 0-based tomorrow. You don’t typically write C-style loops in Julia; you use array functions and operators, and if you need to iterate you write `for i in array ...`.
“ergonomics matter”
Definitely. Ergonomics is the main reason I enjoy Julia. Performance is a bonus.
And 1-based indices are much more natural and ergonomic to everyone who doesn't pretend to be a machine. When I think about the n-th element of a vector, its offset is not the first thing I am interested in.
We had this discussion on here recently. It's really puzzling to me. Julia has the most ergonomic array interface I have worked with by far. How did 1 based indexing ever trip you up?
I’ll never get why people hype up Zed. Sublime Text already has all the same perks—and beats Zed at the very things it claims to improve. Sure, it might not have every advanced feature, but for “vibe coders” who don’t need a full IDE and just want to skim or tweak generated code, Sublime Text is the better choice.
Someone already mentioned the hoarderware issue, which is big for me, so I'll give my other concern.
Years ago on Twitter, I believe it was lcamtuf who asked, "Would you pipe a text file into less?" and Dan Kaminsky (RIP) replied, "Not now that you asked if I would, no." The obvious implication is that people largely didn't think of simple text-parsing utilities as places of concern for security issues, but that is not really in line with reality. I work with crypto, and it seriously matters if I get owned: I could lose amounts of money entrusted to me that I could never hope to recover or repay. I believe it is a basic fiduciary duty to use as much code as possible written in safer languages. Sublime Text is a massive C++ app and I can't look at the code. I am going to preferentially treat the Rust app as better. There are plenty of CVEs in editors. If I could, I would replace every binary written in an unsafe language on every machine I ever use.
My editor touches every bit of infrastructure I have. I use it every day to change the behavior of production machines. I have no choice to treat my editor as trusted. So it needs to be trustworthy to the maximum degree possible.
I truly appreciate your perspective. It’s a very sobering reminder of the responsibility that comes with building a tool that handles code and data.
To be honest, this project is currently in a phase of personal hobby, self-improvement, and self-satisfaction. I must admit that it is not yet ready for mission-critical work.
While I certainly haven't included any malicious code, there are real risks: the app could crash and lose data, or underlying libraries might have vulnerabilities that I haven't had the capacity to fully audit yet. As I’m still experimenting with the architecture alone, I’m not ready to open the source code just yet.
However, your feedback makes me realize how important trustworthiness is for a professional tool. If there is a clear demand for this kind of software and many requests for it to be open-sourced, I would definitely love to consider it as the project matures.
Thank you for sharing such a serious and important viewpoint.
I'm not sure I agree with this. 10x more tokens means leaving the agent to work for 10x longer, which may lead to bugs and misinterpretation of the intention. Breaking the goal into multiple tasks seems more efficient in terms of tokens and getting close to the desired goal. Of course this means more human involvement, but probably not 10x more.
There is no way for this to be true. I read his book about vibe coding and it is obvious that it has significant LLM contribution. His blog posts, though, are funny and controversial, have bad jokes, and he jumps from topic to topic. He has had this style for 10+ years before LLMs came around.
If I remember correctly, Feynman said in one of his lectures that we know the mass of the electron with much greater precision than that of the proton, which may mean that electrons are easier to study. I don't know if this is still true, though.
I have a question - slightly off topic, but related. I was wondering why the Python interpreter is so much slower than the V8 JavaScript interpreter when both JavaScript and Python are dynamic interpreted languages.
JavaScript is JIT’ed whereas CPython is not. PyPy has a JIT and is faster, but I think it is incompatible with C extensions.
I think Python’s threading model also adds complexity to optimizing, whereas JavaScript’s single thread is easier to optimize.
I would also say there’s generally less impetus to optimize CPython. At least until WASM, JavaScript was sort of stuck with the performance the interpreter had. Python had more off-ramps. You could use pypy for more pure Python stuff, or offload computationally heavy stuff to a C extension.
I think there are some language differences that make JavaScript easier to optimize, but I’m not super qualified to speak on that.
> I would also say there’s generally less impetus to optimize CPython
Nonetheless, Microsoft employed a whole "Faster CPython" team for 4 years - they targeted a 5x speedup but could only achieve ~1.5x. Why couldn't they make a significantly faster Python implementation, especially given that PyPy exists and proves it's possible?
PyPy has much slower C interop than CPython, which I believe is part of the tradeoff. E.g. data analysis pipelines are probably still faster in numpy on CPython than on PyPy.
Not an expert here, but my understanding is that Python is dynamic to the point that optimizing is hard. Like allowing one namespace to modify another; last I used it, the Stackdriver logging adapter for Python would overwrite the stdlib logging library. You import stackdriver, and it changes logging to send logs to stackdriver.
All package level names (functions and variables) are effectively global, mutable variables.
I suspect a dramatically faster Python would involve disabling some of the more unhinged mutability. Eg package functions and variables cannot be mutated, only wrapped into a new variable.
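A minimal sketch of that mutability; patching `math.sqrt` here is just an illustration of the mechanism, not the Stackdriver adapter itself:

```python
import math

# Module-level names are just entries in a mutable namespace dict,
# so any code that imports a module can rebind them at runtime.
original_sqrt = math.sqrt
math.sqrt = lambda x: round(original_sqrt(x))  # patch the stdlib

print(math.sqrt(10))  # 3, not 3.1622... -- every caller now sees the patch

math.sqrt = original_sqrt  # restore the original
print(math.sqrt(9))   # 3.0
```

Because any import site can do this at any time, the interpreter can never assume `math.sqrt` still means what it meant a moment ago, which is exactly what makes inlining and caching hard.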
See Smalltalk, Self, and Common Lisp, and you will find languages that are even more dynamic than Python and were at the genesis of high-performance JIT research.
> why is the Python interpreter so much slower than the V8 JavaScript interpreter when both JavaScript and Python are dynamic interpreted languages.
Because JS’s centrality to the web, and the centrality of V8’s speed to Google’s push to keep other platform owners from controlling the web via platform-default browsers, meant virtually unlimited resources were spent optimizing V8 at a time when the JS language itself was basically static; Python has never had the same level of investment and has always spent some of its smaller resources on advancing the language rather than optimizing the implementation.
Also, because the JS legacy that needed to be supported through that is pure JS, whereas with CPython there is also a considerable ecosystem of code that deeply integrates with Python from the outside that must still be supported, and the interface used by that code limits the optimizations that can be applied. Faster Python interpreters exist that don’t support that external ecosystem, but they are less used because that ecosystem is a big part of Python’s value proposition.
First is Google's manpower. Google somehow succeeds in writing fast software. Most Google products I use are fast in contrast to the rest of the ecosystem. It's possible that Google simply did a better job.
The second is CPython legacy. There are faster implementations of Python that completely implement the language (PyPy comes to mind), but there's a huge ecosystem of C extensions written against CPython's bindings, which makes it virtually impossible to break compatibility. It is possible that this legacy prevents many possible optimizations. On the other hand, V8 only needs to keep compatibility at the code level, which allows them to practically swap out the whole inside in an incremental search for a faster version.
I might be wrong, so take what I said with a grain of salt.
Don't forget that there was a Google attempt at making a faster Python - Unladen Swallow. It got lots of PR but never merged with mainline CPython (Wikipedia says a dev branch was released).
Keep in mind that, apart from the money thrown at JS runtimes by Google and others, there is also the fact that Python - as a language - is way more "dynamic" than JavaScript.
Even "simple" stuff like field access in Python may involve multiple layers of dynamically-dispatched resolution.
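A sketch of how many layers a plain attribute access can go through; the classes here are toy examples, purely for illustration:

```python
class Lazy:
    # Instance dict, class dict, base classes, and descriptors are all
    # consulted before __getattr__ runs as the final fallback.
    def __getattr__(self, name):
        return f"computed:{name}"

class Sub(Lazy):
    y = 2

obj = Sub()
obj.x = 1
print(obj.x)  # 1 -- found in the instance __dict__
print(obj.y)  # 2 -- found on the class
print(obj.z)  # "computed:z" -- fell through to __getattr__
```

Each of those three accesses looks identical at the call site, so without a JIT and inline caches the interpreter has to walk the whole resolution chain every time.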
Also, the FFI bindings of Python, while offering a way to extend it with libraries written in C/C++/Fortran/..., limit how freely the internals can be changed (see, for example, the bug-for-bug compatibility work done by PyPy, with constraints that rule out some optimizations).
> python - as a language - is way more "dynamic" than javascript
Very true, but IMO the existence of PyPy proves that this doesn't necessarily prevent a fast implementation. I think the reason for CPython's poor performance must be your other point:
> the ffi-bindings of python [...] limit how freely the internals can be changed
See Smalltalk, Self and Common Lisp for highly dynamic languages with good enough JIT, the first two having their research contributed to Hotspot and V8.
Yeah, I don't see how Python is fundamentally different from JavaScript as far as dynamism goes. Sure, Python has operator overloading, but JavaScript would implement those as regular methods. Python's `__init__` and `__new__` aren't any more convoluted than JavaScript's constructors. Python may support multiple inheritance, but method and attribute resolution just uses the MRO, which is no different from JavaScript's prototype chain.
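The MRO in question can be inspected directly; a toy diamond hierarchy makes the point:

```python
class A:
    def who(self):
        return "A"

class B(A):
    pass

class C(A):
    def who(self):
        return "C"

class D(B, C):
    pass

# C3 linearization gives a single, deterministic lookup order.
print([cls.__name__ for cls in D.__mro__])  # ['D', 'B', 'C', 'A', 'object']
print(D().who())  # "C" -- C is reached before A in the MRO
```

So even with multiple inheritance, every lookup is a linear walk along one precomputed chain, much like following a prototype chain.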
Also Python has a de facto stable(ish) C ABI for extensions that is 1) heavily used by popular libraries, and 2) makes life more difficult for the JIT because the native code has all the same expressive power wrt Python objects, but JIT can't do code analysis to ensure that it doesn't use it.
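A hint of that expressive power, sketched with the stdlib `ctypes` standing in for a real C extension. This is CPython-specific: it assumes the refcount is the first `Py_ssize_t` field of every object, which holds on standard CPython builds but is not guaranteed by the language:

```python
import ctypes
import sys

x = object()
# Native code sees the same PyObject* the interpreter does and can read
# (or write) its internals directly; here we read the reference count
# straight out of process memory, bypassing the interpreter entirely.
refcount = ctypes.c_ssize_t.from_address(id(x)).value

# sys.getrefcount reports one extra reference for its own argument.
print(refcount == sys.getrefcount(x) - 1)  # True on standard CPython builds
```

Since extension code can do this kind of thing at any time, a JIT cannot prove an object is unobserved and, for example, keep it unboxed in a register.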
Even though JavaScript is quite dynamic, Python is much worse. Basically everything involves a runtime look-up. It's pretty much the language you'd design if you were trying to make it as slow as possible.
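Those look-ups are easy to see with the stdlib `dis` module; even a trivial function has to resolve `len` by name every time it runs:

```python
import dis

def f(xs):
    return len(xs) + 1

# The disassembly contains a LOAD_GLOBAL for `len`: the interpreter
# cannot assume the builtin hasn't been rebound between calls, so it
# looks the name up at runtime instead of binding it once.
dis.dis(f)
```

(The exact opcodes vary between CPython versions, but the by-name global lookup is there in all of them.)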
For quiet meditation types, perhaps, but mindfulness meditation, for example, doesn't require stripping away distractions, but acknowledging them while keeping a mindful state instead.
Yes, but I suppose there shouldn't be too many things happening in order to acknowledge them, preferably only the things in your mind.
I've been trying something similar, but more active - beach walks in the early evening. There are still people there, but not too many. My goal was to acknowledge everything and enjoy the moment. I was not quite successful though, it was still too much for me to achieve tranquility :)
I used an Akai LPK25 with my iPhone (using the Camera Connection Kit and a combined USB hub/DAC) and an app called Simply Piano. They make a wireless version of that now that would simplify the setup a great deal. It is a mini keyboard and the keys are quite small, but in my experience it was fine for the beginner stuff (and the keyboard is useful in general later). As I said before, I stuck with this until the app started using keys outside the range I had.
Now, as for "did I proceed with more serious learning" - I alternate through a ton of hobbies. So I moved on after that, though I still go back to it from time to time. But I also have other musical interests and it was helpful to those as well.
Also did a lot of music on the commute on my iPhone with Korg Gadget (and Caustic before that). Sometimes with a keyboard, sometimes without.