This article is part of a series; glad to see they were aware of the two sports that use roadbooks today:

http://www.core77.com/blog/object_culture/roadbooks_part_3_r...

http://www.core77.com/blog/graphic_design/roadbooks_part_2_p...


The article might tout the idea of cyberweapons slightly too much, but I think Stuxnet indeed qualifies as one.

I'm somewhat worried about these things. The problem I see is that we are becoming more and more dependent on technology, and these technologies are increasingly interdependent. A successful attack on one technology can potentially bring entire systems down in unanticipated ways.

The recent power outage in San Diego and nearby areas serves as a good reminder. You don't actively think about power; it is something you take for granted. Only when the power is lost do you realize how dependent everything is on it: traffic lights stopped, ATMs didn't work, credit and debit cards didn't work, freezers and fridges stopped, and so forth. From modern times to the dark ages in the blink of an eye, instant paralysis.

I don't think nuclear power plants are that interesting as targets. Just turning off the traffic light system would be enough to bring an entire US urban area to its knees.

New networks of complex dependencies are being created all the time. The smartphone boom is going to create one, and people will start relying on its existence. If iPhone and Android keep dominating the market, they will create a more homogeneous mass of devices, providing a more consistent attack surface and more potential for widespread damage. I don't see how smartphones could avoid the same problems PCs were/are experiencing. I'm waiting for the first smartphone "UNIX worm".

Wireless features are getting added to cars, creating yet another potential complex network. War-driving could soon take on completely new meanings.


150 years is still far from 969 years ;-)


Well, sure, but 150 years gives you about 75 years of extra research time.


It would be an interesting situation for Nokia. First they announce that the company is betting on Windows Phone, Symbian gets axed, and MeeGo is pushed to the background. Many MeeGo developers saw the writing on the wall and abandoned ship. Hiring them back would be difficult...


Could you describe your experiences a bit more? Buenos Aires seems to be one of the more interesting places in South America. Based on the typical news about South America (or Mexico), just landing on that continent will get you kidnapped, abused by the police, robbed, mugged, cheated, etc.


>Based on the typical news about South America (or Mexico), just landing on that continent will get you kidnapped, abused by the police, robbed, mugged, cheated, etc.

That's like saying, "Based on the typical news about Miami (or Anchorage), you will get eaten by polar bears and die of hypothermia".

Mexico is not South America, and it's nowhere close to Argentina. The distance between Juarez and BA is about the same as the distance between London and Seoul. Seriously. Worlds apart.

You can't generalize about South America. It's bloody ginormous and has far more cultural diversity than North America. It's far from homogeneous. There are some places in South America that are dangerous. Most aren't. Similarly, you don't judge San Francisco by Detroit.

I've travelled through Argentina, Bolivia, Peru, and Ecuador. Buenos Aires is basically Paris or Madrid, but a little run down and depressing (much better night life, though). Bolivia is mostly alpine desert in the south and jungle in the north, and almost all poor. Peru ranges from modern metropolis (Lima) to shacks on a beach (Mancora), to mountains, jungles, etc. Ecuador is the only place I ever felt unsafe, and only in parts of Quito and Guayaquil. The worst places for crime and personal safety were apparently Venezuela and parts of Brazil.

Anyways, the point is you're trying to shoehorn an entire continent with dozens of countries and hundreds of cultures into a single mental schema, and that's just a dumb thing to do. It would be like saying "Asian culture" and thinking you could fit China, North Korea, Japan, Thailand, and India in one label.


Firstly, I didn't suggest Mexico was part of South America. Secondly, I'm not trying to shoehorn South America into a single mental schema. I'm actually trying to do the opposite, i.e. figure out what South America really is. It doesn't seem I can do that based on news, articles, or whatever; I'd have to tour the place myself to come to a conclusion.


You can talk to people who've been there, but each country is pretty different and there are major differences within countries, as well.

As far as the violence goes, it tends to be cyclical. Certainly in parts of Colombia FARC activity is still a major issue, and Venezuela is bad for petty crime and assaults. You get the occasional Shining Path incident in Peru, but it's very low-level. Bolivia has regular strikes that bring the riot police out in La Paz, but the rest of the country seemed perfectly safe. I never had a problem in Argentina, or heard of anyone having problems in Chile, Uruguay, or Paraguay. Brazil is mostly kosher, but I heard of a few people getting robbed, or, more likely, picking up a girl from a club and getting mugged afterwards.

But seriously, do go down if you get a chance. I had a blast there.


Not that you asked me, but living here I wanted to share my two cents. Buenos Aires is not tragic at all; it's... just a little melancholic. A tiny London with poor people and too much humidity, but with great people and lovely architecture. And, well, there is insecurity, no way to deny it, but it all depends on where you go. You can walk, travel, go out, party, do whatever you want, and you won't feel paranoid or anything. Oh, and there aren't as many escaped Nazis here as in the movies, either.


Thanks :-) The comparison to London makes me feel better about Buenos Aires. Puts things into context. I actually used to live in North London, in a semi-rough area. Even though London was an expensive mess and certain areas gave a feeling of insecurity, I still liked it... at least it was an interesting mess.


It can be done in software, and that removes the need to implement it in hardware. Most probably nothing is preserved across standby in a CPU, and when a CPU comes back online, it needs to be set up from scratch.


Quite a bit of stuff is done in assembly:

- Low-level processor setup: MMU, TLB, caching, etc.

- Early-stage boot code.

- Using instructions that are not normally accessible from C: for example, SIMD instructions, count leading zeroes, and some bitwise operations.

- Interrupt handlers, interrupt masking.

- Entry to and exit from low-power modes.

- Entry to and exit from hypervisor and other such virtualized modes.

- Sequences to turn the MMU on and off.

- System calls.

- Locks for mutual exclusion, critical sections.

- Instruction-level optimization in some algorithms.

- Anything that requires stack control or setup (packing arguments, green threads, etc.).

- Machine code generation in compilers.

Some people prefer inline assembly, and a lot can be achieved with C macros and inline assembly. Personally I prefer naked assembly functions in a .s file; it is more readable and requires fewer tricks.
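
For example, count leading zeroes done both ways (a minimal sketch, assuming GCC targeting ARM; the function names are made up):

    /* Inline assembly in a C file. CLZ is available on ARMv5 and later. */
    static inline int clz_inline(unsigned int x)
    {
        int n;
        __asm__ ("clz %0, %1" : "=r" (n) : "r" (x));
        return n;
    }

    @ clz.s -- the same operation as a standalone assembly function
        .text
        .global clz_s
    clz_s:
        clz r0, r0      @ argument and result both live in r0 (AAPCS)
        bx  lr

The inline version lets the compiler schedule the instruction among surrounding C code; the .s version keeps the whole routine visible in one place.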

I rarely see assembly being used for performance. Most of the time the use of assembly is limited to hardware-level interaction that is not normally possible in C.

Even though writing assembly is not that common, the need to read it is:

- Debugging on-target code with a hardware debugger without symbols and source available.

- Low level crash dumps (CPU context).

- Staring at disassembly in general.
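
For instance (assuming a GNU binutils toolchain; the file names are hypothetical), disassembly can be produced straight from a binary, with or without symbols:

    $ arm-none-eabi-objdump -d firmware.elf
    $ arm-none-eabi-objdump -D -b binary -m arm dump.bin

The first command disassembles the executable sections of an ELF file; the second treats a raw memory dump as ARM code, which is handy when all you have is a crash dump.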


That future might be happening already. Both AMD and Intel have products that pair a CPU with a GPU, ARM has Mali, and Nvidia's Tegra has an on-chip GPU.

I think discrete GPUs will become niche products in the future. Once mainstream GPUs are on-chip, the variety of different GPU architectures will probably be reduced. The next step might be a standard ISA for GPUs.

It is hard to say where GPUs will be in three years, but at least the industry is getting interesting again. Discrete GPUs have been more of the same for so many years, but on-chip GPUs are potentially game changing.


Even though your vision is probably right, I'm not entirely happy with it. I love choice; I love being able to choose a certain processor and GPU and upgrade one of them after a year.

I'm probably in the minority, though, so business-wise it makes sense.


It seems you are in a situation I was in a long time ago. I was also frustrated: the CEO had the same attitude, the start-up never started making money, the pay was low, I had no savings.

I was thinking about quitting for a long time but didn't do so. I felt loyalty, I was green, I was the developer guy, I didn't really know how businesses are created, yadda, yadda. At the same time I felt that things were not right, that we were not going in the right direction.

After some years I decided to quit. Being broke was the last straw. I was also becoming toxic, a hostage; I had to quit to save both myself and the start-up.

It was a bit hard to find a corporate job, because I hadn't developed any good connections in the industry during my start-up time. Eventually I landed a job at a big company. It wasn't the best of jobs, but it got me started in my current career. After two years I got a call from a manager who had quit that company; that's how I got my next corporate job. Then after three years the same manager called me again, and I got my current corporate job. And now I have been calling that manager and some other ex-colleagues to offer job opportunities. You get the picture.

In hindsight, quitting was the best decision I ever made. Two years after I left, the start-up lost funding and went bust. A steady income improved my life significantly: I paid off debts, started saving, travelled, had real vacations. Almost all the stress and pain in my life was gone.

The only thing I regret is that I didn't quit earlier. I didn't believe in the start-up, but I kept hanging around, wasting my time. Since this experience I have made the decision to trust my gut feelings: if I feel something is wrong, I'll trust myself. So far this has served me well. I keep large savings; the idea is that if I ever need to make a jump into the void, I'm able to do it.

My advice based on my own experiences:

- If you are finished, get out, because you are wasting your time and potentially other people's time.

- To maximize your chances of getting a job at a big company, look for a good match. When I quit, I applied to all sorts of interesting jobs but never got a reply. I did graphics programming, and unsurprisingly I was eventually hired by a company that desperately needed a graphics programmer.


Notice the following: "AMD says it maintains a start-up mentality even though it's a large company".

This is becoming a trend now; AMD is hardly the only company touting this. My current company is doing the same. My old company approached me recently and told me they are going to start working in start-up mode in California, and this is an old and rigid company.

I hold Paul Graham, Joel Spolsky, and others accountable for this. They have glamourized start-ups so much that these big, established companies have started feeling bad about themselves. Or maybe they have finally realized that big organizations and rigid processes don't work very well.

There are probably upsides to this, but the downsides are killing me. For a large company, working in start-up mode obviously means that engineers work even harder, are held more accountable, and have to deliver on even more ridiculous schedules. The result is even more chaos, more avoidance of work and responsibility, more panic in QA, and so forth.

Personally I don't like this trend. If I want to work start-up hard, I will join a start-up and potentially become rich in the process. Working in start-up mode at a big company has so far meant that I work through weekends and don't get paid for it.


It is such an annoying new trend for a company to claim it is in "startup mode" when it is all just superficial or even outright bullshit; it has become quite common in NYC now.

One place I declined to work at claimed to be in startup mode, yet their offices looked like those of a financial company (which they were not), everyone was seriously dressed up, and the code base and infrastructure were pretty big and inherited from the parent organization.

A place I used to work at is also going through the motions of "becoming a startup" -- a company that has not been an internet startup for over 10 years. Apparently being a startup means rewriting the whole codebase in Java, offering free snacks and drinks, tearing down the cubicle walls (say goodbye to ever getting "in the zone" while coding), and calling daily standups and the use of JIRA "being AGILE". Sticking feathers up one's butt does not make one a chicken...


I like this trend. As a developer, I always try to spend all my time on tech stuff, not on the management overhead of a big company. Of course, I don't want to work OVERTIME.


What does JIRA have to do with it? It's just an issue tracker like any other.


Nothing wrong with JIRA or GreenHopper, but its use, by itself, does not make a group "agile".


Check for Java, standups, JIRA, snacks, etc.

But only free espresso -- I should forward your comment to my boss so we get more drink options. :-)

(A Python guy just started and is making predictable noises about rewriting thousands of pages of working code from a scripting language with very similar capabilities... That will probably happen simultaneously with free Coca-Cola being introduced.)


Is that scripting language Perl or PHP?


Perl (sorry for sleeping before answering :-).

I've pointed the Python guy to Moose, "Perl Best Practices", etc. It didn't help. Sigh... It is sad when people don't drop the language-wars garbage after turning 20. :-(

At least he didn't argue to replace the toilet paper with bills...

In my anecdotal experience, this seems to be a not entirely unusual Python thing. (One True Way, etc.)

(And before someone starts arguing that a rewrite might be justified in some cases -- the guy is so new he doesn't know the particulars of that area.)


If so, they seem to have appropriated the 1998 definition of "Startup".

Hiring 1000 people at once doesn't seem like the best way to maintain your startup's corporate culture, but it does seem like an effective way to burn through $250M in funding. All they need is a foosball table in the middle of the dev space and they'll have that oldskool dotcom vibe nailed.


So find another job or another career. I'm not being mean: there are still jobs that are 40 hours per week. If those jobs don't appeal to you, switch careers and find a work-life balance that does.

People get exploited because they don't act on their other options. I work 40 hours a week.


I was just pointing out a trend: it seems that more and more companies are trying to appeal to the start-up types. I'm not too optimistic that these big-corp-in-start-up-mode ideas will work; most people are not up to the task. I view this kind of mode as an exception, not the norm.

Personally I don't mind working hard. In my case the work has been interesting enough, and I view the long hours as an investment in acquiring certain technology experience. At the same time I feel an increasing urge to go and work for real start-ups again.


Companies are trying very hard to attract the talent that does not want to sit head-down in a cubicle for 8-10 hours a day in a tomb-like (no conversation) development department, so they come up with the term "startup" even if a 40-hour work week exists. Same with catch phrases like Agile (true Agile, a creative environment), etc. But some programmers love that environment :D It is all about fit.

Though companies as established as AMD, Amazon, and Netflix, for example, do require insane hours (if not blatantly, then tacitly).


I actually think the explosion of software engineering salaries (led by VC-funded SF startups) is responsible for this work attitude, not just the glamorization of startup life.

Larger companies (and even smaller companies that aren't growing ultra-rapidly) have to pay much more to compete on salary, and as a result they now have to get much more productivity out of each employee than before. Hence more accountability and more ridiculous schedules.


Sounds similar to what leads many companies to claim they're "Agile".

Someone I knew interviewed a candidate and asked what he knew about Agile. He responded by telling a story about his previous workplace: "We had a one-hour status meeting every day. When management decided to adopt Agile/Scrum, the easy way to do it was to rename that daily one-hour status meeting the Scrum Meeting".


I agree there is a trend, but I think it has more to do with the idea that you will be making something new, not that you will need to work a million hours to do it. I've never needed to work a lot of hours to get my projects done, and I always avoid companies where people seem to work too many hours.

