Quick plug for DreamHost, who have partnered with Let's Encrypt and offer free SSL certificates with their standard web hosting product, easily set up from the control panel: https://www.dreamhost.com/hosting/ssl-tls-certificates/
This is despite the fact they previously made money from selling certificates, and still will take your money for doing so if you want.
(Not an employee, just a satisfied customer who now has free HTTPS hosting etc..)
Thanks for that. I've been on commodity hosting for a while (Site5, and I've actually been very happy with them), but they don't offer Let's Encrypt, so this is quite attractive.
I think that's part of it: a substantial amount of the web is on shared hosting (probably dictated by other areas of the business) and as such won't have any level of root access. The hosting company can therefore control cert installation, and thus cost.
Usually add-ons are administered by the hosting company though (?) - so if they want to own the cert purchase/installation flow, then they can certainly do that.
If you have a shell and a public IP address, you can get a certificate. Root access is only required for HTTP authentication; you can also authenticate to LE using DNS. I actually just learned about the DNS option.
Check out the lego project; it makes DNS auth very easy :)
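For the curious, a sketch of what the DNS side involves: per RFC 8555, a DNS-01 challenge asks you to publish a TXT record containing the base64url-encoded SHA-256 digest of the key authorization (the challenge token joined to your account key's JWK thumbprint). The token and thumbprint below are made-up placeholder values:

```python
import base64
import hashlib

def b64url(data: bytes) -> str:
    """Base64url without padding, as ACME (RFC 8555) requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def dns01_txt_value(token: str, account_thumbprint: str) -> str:
    """Compute the TXT record value for an ACME DNS-01 challenge.

    The key authorization is token + "." + the JWK thumbprint of the
    account key; the TXT record holds the base64url-encoded SHA-256
    digest of that string.
    """
    key_authorization = f"{token}.{account_thumbprint}"
    digest = hashlib.sha256(key_authorization.encode("ascii")).digest()
    return b64url(digest)

# Placeholder token and thumbprint for illustration only:
txt = dns01_txt_value("evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA",
                      "9jg46WB3rR_AHD-EBXdN7cBkH1WOu0tA3M9fm21mqTI")
# This value would go in a TXT record at _acme-challenge.example.com
```

Tools like lego automate publishing that record via your DNS provider's API and then telling the CA to verify it.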
These hosting companies are already doing enough things "wrong" that if "well, everybody just stop using them" were going to be a viable strategy it would have worked by now for those other things.
We currently do not have the power to change the behavior or the market share of these hosting companies in any significant way. That leaves working around their behavior as the option.
Who said anything about doing it to change their behaviour? Do it because you aren't getting value for money! If there are better alternatives, then use them.
Not if you have user-generated subdomains, or have subdomains that you don't want to advertise in subjectAltName, or simply just want the full power you used to have with HTTP to use any subdomain you want -- then you'll need a wildcard certificate, which Let's Encrypt won't offer you. Nor will any other free CA, except for CAcert, which browsers will not trust.
The absolute cheapest I've found one of those for is $95 a year. So basically, more than all of my server hosting, my domain hosting, and my domain privacy combined, all to sign my CSR that has a one-character change in it.
Of course, if you want both yourdomain.com and www.yourdomain.com (because *.yourdomain.com alone won't match the former), then that will cost extra. A lot extra.
I've heard it's possible, but it would be a substantially increased maintenance burden. You won't be able to put all your subdomains into the subjectAltName section of one certificate, and so you'll need to constantly reload your nginx/Apache configurations to reference all of the different certificates. Plus Let's Encrypt certificates expire every 90 days, and you will have to update all of your certificates before that happens.
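On the 90-day expiry point: keeping track of when each certificate needs renewing is straightforward to automate. A minimal sketch using only Python's standard library (the 30-day threshold is just an assumed renewal policy, and `not_after` uses the timestamp format `ssl.getpeercert()` returns):

```python
import ssl
import time

def days_until_expiry(not_after, now=None):
    """Days remaining before a certificate's notAfter timestamp.

    `not_after` uses the format found in ssl.getpeercert(), e.g.
    "Jun  1 12:00:00 2025 GMT".
    """
    expires = ssl.cert_time_to_seconds(not_after)
    now = time.time() if now is None else now
    return (expires - now) / 86400

def needs_renewal(not_after, threshold_days=30):
    """Renew when fewer than `threshold_days` days remain
    (Let's Encrypt certificates are valid for 90)."""
    return days_until_expiry(not_after) < threshold_days
```

A cron job could loop over every certificate on disk with something like this and trigger reissuance only for the ones approaching expiry.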
I really can't for the life of me understand why they won't offer a wildcard certificate, if you prove you are the owner of the base domain name. The cynic in me expects it's the same reason every CA in existence charges four to ten times more for a wildcard certificate that literally costs them nothing more to make. That entire market would vanish overnight if Let's Encrypt offered them for free. I would even immediately start using them for my site.
I suspect it's both a policy question and the fact that there's not really much of a consensus on wildcard validation in ACME (i.e. how to approach it, and whether it should be a topic for ACME or just a policy decision for the CA).
In terms of policy, wildcards encourage some users to both reuse the same certificate for multiple services (i.e. mail, website, api, etc.), and use certificates that are "broader" than needed (use wildcards everywhere because they're "easier", despite the fact that you only need a certificate for imap.example.com). This increases the impact of Heartbleed-like vulnerabilities significantly (in that unrelated services using the same key are suddenly all vulnerable to MitM attacks). It might not be the worst idea to give the ecosystem some time to get used to non-wildcard certificates in order to discourage that behaviour.
I think there's a good chance we'll see wildcard support sooner or later.
I would welcome key usage constraints to limit a wildcard cert to https only.
But yeah I do hope you're right. I'll switch my domain off self-signing the moment a trusted CA offers a free wildcard cert with elliptic curve signing.
Cloudflare works at the DNS/routing level. You can use their layer to communicate via HTTPS with the browser. The connection between your site and Cloudflare won't be encrypted... which is a bit of an antipattern (as discussed elsewhere).
I once tried to set up Let's Encrypt (or any free SSL) with my Namecheap domain, and apparently you can't. Is that true? I don't really want to change domain registrars because I'm lazy and it's probably time-consuming to get right... can I get free HTTPS if I host my site on Netlify and just point my URL there?
All I use it for is my Jekyll-powered personal site on GitHub pages. I don't mind not using GitHub for it anymore, just want it to work with SSL.
In order to do this, you'd need to put an SSL endpoint between your github page and the client. It could probably be done but it sounds like a no-no.
For instance, I'll bet that if you set it up so that your namecheap DNS A entry was pointed to your own box that had nginx/haproxy/cloudflare/whatever handling SSL decryption (and certs) and then backended to your github pages, it would work, but I'm not a fan of the idea.
GitHub doesn't support custom SSL for pages. A workaround is to change your DNS provider (which is probably Namecheap right now) to Cloudflare and use Cloudflare's free SSL (so it essentially will be an SSL proxy).
You can do this while still keeping Namecheap as your registrar, and it's totally free.
Domain registrar has nothing to do with the hosting. As long as you can point a domain to an IP address you can get a certificate.
Namecheap even goes a step further and has an API for domain validation (makes LE certificate authorization easier) so they are very friendly in regards to Let's Encrypt.
Let's hope that the pressure is high enough for these hosts to rethink their strategy. If your website shows a visible warning and search engines rank it lower just because your host wants more money for a certificate that you can get elsewhere for free, you are likely to switch.
This is a common reaction I see from a lot of people. Tech is hard, but the truth is there are a lot of very smart people working on making it easier for everyone.
When something many people/companies do/use/have feels hard or out of reach, I think it's important to start asking different questions and look around for real solutions.
Small children have to learn language from nothing. They just figure it out through exposure and practice. Even pets learn some language. This is the model to emulate.
Ultimately language use requires a few skills:
* a good parser
* motor cognition/coordination
* a good memory
* semantics/context
* vocabulary
* situational awareness
The first two in the list are what small children struggle with the most. Fortunately, we can eliminate motor coordination as a requirement for AI. And although extremely powerful parsers demand specialized expertise to produce, this part of the problem is straightforward. I write multi-language/multi-dialect parsers as an open source hobby.
I discount vocabulary and situational awareness, because most children still haven't figured these out by the time they enter high school, long after they have learned the basics of speech. That pattern of human behavior suggests that while it might be hard to teach these skills to a computer, you can put them off a long ways down the road, until after basic speech is achieved.
If somebody paid me the money to do this research my personal plan of attack would be:
1. Focus on the parser first. Start with a text parser and do audio to text later. Don't worry about defining anything at this stage. When humans first learn to talk and listen they are focusing upon the words and absolutely not what those words mean.
The parser should not be parsing words. Parsing words from text is easy. The parser should be parsing sentences into grammars, which is harder but still generally straightforward, with many edge cases.
2. Vocabulary. Attempt to define words comprising the parsed grammar. Keep it simple. Don't worry about precision at first. Humans don't start with precision, and humans get speech wrong all the time. This is especially true for pronouns. Just provide a definition.
3. Put the vocabulary together with the parsed grammar. It doesn't even have to make sense. It just has to have meaning for words and the words together in a way that informs an opinion or decision to the computer. Consider this sentence as an example: I work for a company high up in the building with a new hire that just got high and gets paid higher than my high school sweetheart.
4. If the sentence is part of a paragraph or a response to a conversation you can now focus on precision. You have additional references to draw upon. You are going to redefine some terms, particularly pronouns. Using the added sentences, make a decision as to whether new definitions apply more directly than the original definitions. This is how humans do it. These repeated processing steps mean wasted CPU cycles, and it's tiring for humans too.
5. Formulate a response. This could be a resolution to close the conversation, or it could be a question asking for additional information or clarity. Humans do this too.
6. Only based upon the final resolution determine what you have learned. Use this knowledge to make decisions to modify parsing rules and amend vocabulary definitions. The logic involved is called heuristics.
The only way all this works is to start small, like a toddler, and expand until the responses become more precise, faster, and more fluid. At least... this is how I would do it.
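Steps 1, 2, and 4 of the plan above can be sketched as a toy pipeline. This is a deliberately naive illustration: the tiny lexicon and the most-recent-noun heuristic for pronouns are hypothetical stand-ins, nothing like a real system:

```python
import re

# A toy lexicon standing in for steps 1-2: tag each word and attach a
# rough sense. All entries here are illustrative, not a real vocabulary.
LEXICON = {
    "alice": ("NOUN", "a person"),
    "ball": ("NOUN", "a round toy"),
    "it": ("PRON", None),
    "threw": ("VERB", "to propel through the air"),
    "bounced": ("VERB", "to rebound after impact"),
    "the": ("DET", None),
}

def parse(sentence):
    """Step 1: split a sentence into (word, tag, sense) triples."""
    words = re.findall(r"[a-z']+", sentence.lower())
    return [(w,) + LEXICON.get(w, ("UNK", None)) for w in words]

def resolve_pronouns(parsed):
    """Step 4, naively: bind each pronoun to the most recent noun."""
    resolved, last_noun = [], None
    for word, tag, sense in parsed:
        if tag == "NOUN":
            last_noun = word
        elif tag == "PRON" and last_noun is not None:
            word, sense = last_noun, LEXICON[last_noun][1]
        resolved.append((word, tag, sense))
    return resolved

parsed = parse("Alice threw the ball. It bounced.")
resolved = resolve_pronouns(parsed)
# "It" gets rebound to "ball", the most recent noun.
```

Even this crude heuristic captures the idea in step 4 of revisiting earlier definitions once more context arrives; a real system would replace every piece of it with something learned.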
It depends a bit on what you are trying to achieve but I think hooking neural type networks together to simulate human mental faculties might be a better way forward. For instance much of human thinking seems to work around visualizing things in 3d space so you can say to someone imagine a dog on a skateboard on top of a hill and you give it a push, what happens? Once you've got that kind of stuff working with spatial awareness, cause and effect and so on using neural type processing I think the language understanding would come fairly naturally.
You have some good points, but this naive approach of handcoding "cognitive modules" was tried many times in 20th century, and it didn't work at all.
But look at what Deepmind does: it takes these ideas (and also ideas from systems neuroscience), implements them as differentiable modules and trains them on data in end-to-end fashion. This works really well.
Learning is very important, much more important than architecture. If you have a model that can learn you can add more structure later - again this is what modern deep learning is all about.
Since I started looking for an alternative to NPM I have discovered a couple of things:
* All current package managers are either language or OS specific. What if you have an application with code written in multiple languages?
* NPM didn't have any kind of integrity checks for its packages, and I assume most package managers don't either. If you download a corrupt package, for example, you won't have any idea and it will still install.
* Some package managers do better than others at managing packages. I found NPM encourages dependency hell and provides very few management tools for dependent or installed packages.
* A lot of package managers seem to intermix packaging, distribution, and a registry. The registries tend to have limited names to pick from (like real estate) and can result in legal problems. Also if registration to the service catalog is required you cannot self-host or self-manage the distribution of your application.
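On the integrity point above: lockfile-based package managers record a hash per package (npm's later lockfile format uses SRI-style strings like `sha512-...`) and refuse to install anything that doesn't match. A minimal sketch of how such a check works; the function names are illustrative, not any package manager's actual code:

```python
import base64
import hashlib

def sri_hash(data, algorithm="sha512"):
    """Produce a Subresource-Integrity-style string, e.g. 'sha512-...'."""
    digest = hashlib.new(algorithm, data).digest()
    return f"{algorithm}-{base64.b64encode(digest).decode('ascii')}"

def verify(data, expected):
    """Check downloaded package bytes against the recorded hash."""
    algorithm, _, _ = expected.partition("-")
    return sri_hash(data, algorithm) == expected

# Record a hash at publish time, verify it at install time:
pkg = b"fake tarball bytes"
recorded = sri_hash(pkg)
ok = verify(pkg, recorded)                    # True
corrupt = verify(pkg + b"corrupted", recorded)  # False
```

A corrupted or tampered download fails the check before anything is unpacked, which is exactly the guarantee the bullet above says was missing.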
> * All current package managers are either language or OS specific. What if you have an application with code written in multiple languages?
guix and nix both work cross-language and cross-distro. Still OS-specific though, since they're Linux-only AFAIK. Also, containers partially solve this problem.
> * NPM didn't have any kind of integrity checks for its packages, and I assume most package managers don't either. If you download a corrupt package, for example, you won't have any idea and it will still install.
Any package manager that doesn't do integrity checks is a bad package manager. The only one I know of currently that doesn't is npm, but I haven't looked deeply into every available package manager.
> * A lot of package managers seem to intermix packaging, distribution, and a registry. The registries tend to have limited names to pick from (like real estate) and can result in legal problems. Also if registration to the service catalog is required you cannot self-host or self-manage the distribution of your application.
What package managers don't let you self host? I'm truthfully not aware of any. Even NPM does according to a quick google.
From your readme: "biddle is inspired by the incredible awesomeness of NPM". Since NPM is literally the worst package manager I have ever used, that line makes me want to stop reading and never touch biddle. I'd word it differently.
Edit: Reading biddle further. Dependency management and central hosting are some of the primary reasons to have a package manager. At least for me, that kills any interest at all. I imagine there's a niche market though?
I think it's not bashing of npm specifically so much as it is the node ecosystem it serves and depends on; at least in my mind it's difficult to separate node from npm. That said, for what it's trying to do (read a list of deps, resolve vs. registry, download and unpack) it seems to do a fine job of it.
My major complaint about npm is the choice to allow version range operators on dependency declarations. We know the node.js ecosystem places a high value on composability, so using lots of tiny modules which themselves depend on lots of tiny modules is the norm. This is a problem though because range operators get used liberally everywhere, so getting reproducible builds is like winning the lottery.
There are other things I don't like about using npm: node_modules/ is big and has a lot of duplication (even with npm v3), it's pretty slow, historically it has been unstable, it's still crap on Windows, etc. But for someone who has 'ensures reproducible builds' as part of their job description, the way its modules get versioned is its worst feature.
For reproducible builds (or at least 'to get the same versions again') you should be using 'npm shrinkwrap'. (Of course there's probably more you should do to get true reproducible builds, but that goes for any package manager).
The range operators are important; otherwise you'd never be able to resolve two packages that want a similarly versioned sub-dependency, e.g. jquery 1.12. Without range operators those two packages would have declared minor version differences (1.12.1 and 1.12.3) depending on when they were published, and you'd always end up with duplicated dependencies.
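The caret ranges in question resolve like this. A naive sketch of npm-style `^` matching; real npm semver has extra rules for 0.x versions and prerelease tags, omitted here:

```python
def parse_version(v):
    """'1.12.4' -> (1, 12, 4), so tuples compare numerically."""
    return tuple(int(x) for x in v.split("."))

def satisfies_caret(version, spec):
    """Naive check of an npm-style caret range, e.g. '^1.12.1'.

    For major version >= 1, ^X.Y.Z means >= X.Y.Z and < (X+1).0.0.
    """
    assert spec.startswith("^")
    base = parse_version(spec[1:])
    ver = parse_version(version)
    return base <= ver < (base[0] + 1, 0, 0)

# Two packages declaring ^1.12.1 and ^1.12.3 can both be satisfied by
# a single installed jquery 1.12.4, so the tree is deduplicated:
shared = satisfies_caret("1.12.4", "^1.12.1") and \
         satisfies_caret("1.12.4", "^1.12.3")
```

This is also why reproducibility suffers without a lockfile: any new 1.12.x release silently satisfies both ranges on the next install.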
I'd argue 'node_modules is big' is not a fault of npm. If the package or app you're trying to install generates a large node_modules dir, that is something you should take up with the package maintainer. See buble vs babel - buble has a way smaller dep tree.
npm is only slow in the ways that all other package managers are, when installing large dependency trees or native dependencies (like libSass) and it is way faster than say pip and rubygems in this regard. When I 'pip install -r requirements.txt' at work, I literally go and make a coffee.
Also, I've never experienced any instability, though I may have been lucky. Certainly it has been very stable for the last year or so, when I've been working with it a lot. Could you elaborate on why it is crap on Windows? I thought all major issues (e.g. the deep nesting problem) were now fixed...
The main problems we ran into with shrinkwrap were:
It shrinkwraps everything in your current node_modules directory.
This includes platform specific dependencies that may not work on other platforms but now will cause npm install to fail instead of just printing a message about it.
So our current workflow has to be:
1. Update package.json
2. rm -rf node_modules/
3. npm install --production # This doesn't include any of those pesky platform specific packages
4. npm shrinkwrap
5. npm install # Get the dev dependencies
As far as the other comments about npm, I just generally have more problems with it than rubygems/bundler and the general OS package managers.
Shrinkwrap is ridiculous. I'm expected to go look at every resolved dependency and individually add them if I want to update or not? No thanks; one app at my workplace defines ~50 top level dependencies, but this balloons to almost 1300 - and this is with npm v3 - after npm install. Ain't nobody got time for that.
Deep nesting is not 'solved' it just doesn't happen 100% of the time anymore. If you have conflicts, you still have deep trees. I suppose range operators help with this a little, but looking at what gets installed it doesn't seem to help that much; I still have duplicated dependencies.
I was mentally comparing npm to tools like maven, ivy and nuget, all of which are faster but also not interpreted. Not a fair comparison I guess.
> Shrinkwrap is ridiculous. I'm expected to go look at every resolved dependency and individually add them if I want to update or not?
Not sure you're aware of the suggested flow (see here [1]), but it isn't ridiculous. Use 'npm outdated' to see which packages are out-of-date and 'npm update --save' to update a dep (and update the shrinkwrap file).
Keeping track of stale sub-dependencies is a problem in and of itself, but again, that exists with any package manager. (Because you will always need to pin dependencies before you go to prod, right?) So that 'lockfile' will get out of date pretty fast. Node at least has solutions for this that other communities don't [2] (I haven't tried this service).
It would be nice if the Go crowd and the Rust crowd, both of which are developing new package managers, had at least a common spec on how to describe dependencies.
This requires a programmer who writes a lot in both languages, who has enough
energy left to help with two package managers, and who happened to be in the
right places in the right moments to actually join the projects.
It's already very hard to find a programmer that writes a lot in more than one
language and cares about packaging/deploying software.
That's a problem for a team. Having written in both languages, I consider package management to be the biggest problem in both.
(And then there's catkin, the build system for the Robot Operating System. That's the build job from hell. ROS is a message-passing environment to which a huge number of existing packages have been adapted. Multiple languages, different underlying libraries, and no central control of versions across packages. It sort of works. Although there are bad days when it breaks the Ubuntu updater.)
The primary problems in that article are nothing to do with technology.
Consider these statements:
* "Tom is a genius", which implies the Tom solution is a trusted asset beyond improvement, doubt, or questioning. It also implies the solutions provided by Tom are golden, unimprovable truths.
* The various statements from Scott suggest an indisputable faith in process and convention.
Clearly there are failures at multiple levels here. First of all Tom sounds like a whiny bitch. These personality types are inherently defensive and typically seek to reinforce an individual's position of self triumph in a small pond. Toxic.
Secondly, Scott has a lot of faith in process and conventions. Processes and conventions are the absolute enemy of creativity. I understand processes are necessary to establish a certain level of security, but they more typically exist to satisfy some OCD insanity where there is comfort in doing things a particular way without consideration for why they are done that way. Many developers cannot tell the difference between security and superficial stupidity, and many technology abstractions enable that stupidity, blurring the difference between the two and enabling still more of it.
All of the prior mentioned failures are allowed to exist because the management doesn't want to be involved until there is a problem, such as Tom crying. This is called enabling.
Most important of all is that all technology should be questioned, doubted, and challenged. Obviously this sort of continuous improvement is utterly absent, because everybody has a competing agenda.
Process is the preventative of 'bikeshedding'. It's not so much about security as consistency, and it also prevents having the "why are we doing it this way" discussion every single time.
I think bikeshedding can exist irrespective of processes. Bikeshedding is the behavior of misprioritizing trivialities over core concerns because they are easier to think through. The solution for bikeshedding is prioritizing tasks in alignment with a written mission or plan. A documented plan describes how to achieve the mission, where processes are the requirements for attaining a certain level of conformance to a sub-goal. In other words, the processes of a well-designed system are the minimal number of barriers, getting in the way just enough to prevent mission failure.
I think the correct term to use in this case, is
"When engineers start acting religiously..."
As a person doesn't need a religion, to act religious; I can be religious about canned goods or martial arts, or perhaps, engineering.
I do agree that using the term "religious people", while the meaning is understood, makes a generalised assumption about a larger group of people who do not all adhere to the imagined stereotype, and it casts them in a negative light.
I suppose that the point you're making is that zer0gravity used the term "religious people", and that seems to be acceptable to most people, few object to it;
whereas "people of color" or any other term that is deemed unacceptable to make that kind of negative assumption about, would not be okay.
And that's a rather big double-standard.
Regardless though, the meaning that zer0gravity was trying to convey, was clear and understood.
I think taking up offence at zer0gravity's comment and use of the term "religious people" just detracts from the topic being discussed;
and I highly doubt any ground will be gained on the point you're trying to make; at least on this page and topic.
True, though this story sounded more like one engineer convinced of his godhood and a bunch of clueless managers who ensure that the only way you can keep your job is to believe in the greatness of the solution and its author...
As a JavaScript developer I feel the same way about Java... Unfortunately JavaScript developers are fully aware they aren't Java developers, but many Java developers don't seem to share this same realization.
I know of several people who choose to not waste their time with Facebook. Seriously, why? Any time I need to nudge somebody, that I don't care enough to bother with in the real world, there is Linked In. Everybody else either has my phone number, email address, or they are people I don't know.
>waste their time with Facebook
That's a bold statement. People like me use it as a lightweight option to stay abreast of what's going on with friends and family without having to invest in a long call or email. You may think I don't care, but I do; it's just hard and time-consuming to call 50 people and ask 'Did you go somewhere this summer? How was it? What did you see?'. Instead I just 'like' a cool picture they share, and when I call them I can start by saying 'Wow, you had such a good time in Paris; why don't you visit us next year?'
LinkedIn is a spammer paradise and a cancer for everybody else though. LinkedIn is designed decently but just as full of spam and overflow as Myspace was. Pretty much only worth using between jobs - but otherwise, I think there are much better platforms. They've become all about upselling: "Buy our premium membership and see the people who've seen your profile!" And they of course, sell your data and viewing habits without care.
It seems very common to me for people to act like there's no middle ground between not having a Facebook account and scrolling up and down the feed for hours a day. Personally, I use it for chat and events and that's basically it. Those two features however are very useful tools for me.
I just checked and the last thing I posted on my wall was a year ago. The thing before that 6 months before. Hardly a time sink.
He is at 13% now and needs only to be at 15% to enter the national spotlight of increased media coverage and access to the general debates. His polling numbers are increasing with a notable momentum as general support of the Republican candidate and Democratic candidate are both at all time lows.
As long as the majority of voters think of Trump as a broke racist narcissist and Clinton as an apathetic or unethical elitist he may well have a chance.
13% huh? That's a bit better than I thought. Definitely sounds like he has a shot of making it to 15%. I'm all for it. I'd love to have him in the debates. I really liked this ad they made:
To save everyone else the click: I had heard the 13% figure before, but it was for favorable demographics (like younger people or Bernie supporters), so I was expecting the link to contradict you.
But no, the 13% was one poll's result for Johnson's general support, not simply a favorable demographic for him.
I think you underestimate the size of the "precariat" that feels it's on a trajectory towards becoming economically and culturally disenfranchised. The educated elites are quite disconnected with the population from the lower half of the "middle class" and downward. Look at what's happening in the Internet driven media of today. The lion's share of the volume online is outrage politics.
It's the young and economically squeezed vote you need to keep track of, both on the right and left. Trump and Sanders both knew this.
True. Not saying it can't happen. I just think it's unlikely. I think Johnson is currently polling somewhere around 10%? I guess the big hurdle will be getting in the debates like Perot did. Not sure what the current rules are for that.
Who has the better chance of hitting 15% and getting into the debates, Johnson or Jill Stein? Honest question, in case I am ever polled I want to choose the most likely alternative.
The other clowns who ran for the Libertarian party nomination make it really hard to take Gary Johnson seriously. That he is their candidate really just says he meets the very, very low bar that was set by the other candidates.
A two-time governor running with another two-time governor as his VP? Both former Republicans overwhelmingly re-elected in Democratic states? Both fiscally conservative and socially liberal?
Not to mention that, while I do not personally agree with Gary Johnson on many of his policies (I consistently score lowest with him on the iSideWith quiz), it's hard to ignore that quite a few of the policies he enacted as Governor have actually worked quite well. Admittedly, I have not researched this much. I simply haven't had time this summer, and when I did look at this, back in May, it didn't look like he'd get to 15%.
Both he and his running mate (Bill Weld) are 2-term governors. I don't know why their primary opponents McAfee or Petersen would detract from that. Are you under the impression that they chose their opponents?
This could always happen in the US. US military officers swear an oath to defend the Constitution against all enemies foreign and domestic, but they don't swear an oath to other officers or elected officials. That said, if an elected president violated the Constitution, the US military could arguably justify a coup to restore constitutional order.
Many posters here find it odd that a military force could represent secularism. I am curious why that is. Most uniformed militaries I have observed have seemed more secular than the people they represent. Let us not forget the Muslim Brotherhood was democratically elected in Egypt and attempted to instill sharia law; a military coup ended that nonsense. Also, Hitler came to power through democratic institutions and was not immediately supported by the military.
> US military officers swear an oath to defend the constitution against all enemies foreign and domestic,
True, but...
> but they don't swear an oath to other officers or elected officials.
... this part is, interestingly, true only of officers. Officers don't swear an oath to other officers or elected officials, but enlisted personnel do both -- first to the President and second to the officers appointed over them, all subject to the UCMJ. [0]
Not, of course, that oaths constrain behavior, but, to the extent oaths are relevant: the Constitution names the President Commander-in-Chief of the military and gives Congress the power to regulate the military, so an oath by a military member to support the Constitution necessarily involves a degree of commitment to elected officials, and, to the extent those elected officials have used their Constitutional powers to establish and appoint superior officers, to those officers as well.
Perhaps a sufficient justification would be that an elected president does something that directly and literally violates an article of the actual Constitution, and the military seeks to appoint a new leader from the line of succession. That way there is no violation of the principles of Cincinnatus, and an elected civilian retains the position as opposed to an unelected military figure.
It could. Also pigs could learn aviation. It is unlikely for either to happen, however.
The U.S. has a built-in coup mechanism: the president effectively serves at the pleasure of Congress. While the president is an elected official in a different branch of government, the Constitution was pretty clear on who runs the show. The House and Senate can, at any time, remove the president from office.
It'd be a helluva thing to do, but military coups are fairly spectacular also. On any given day that both houses were in session, given a hated enough president, you could probably do the entire thing on a voice vote in an hour or so. At that point the military and police authorities would be responsible for removing the president and swearing in the new one.
You might be able to spin out some scenarios for a U.S. coup d'etat, but there is a very long line of tradition in the states that the military serves the civilian authority. I think most command-level officers are fairly well indoctrinated in avoiding that scenario at all costs.
> Many posters in here indicate oddness that a military force could represent secularism. I am curious why that is.
My hypothesis is that the average English speaking tech worker is both more secular than their nation's population and more secular than their nation's military, making it more difficult to think of militaries as secular organizations. I know I tend to think of the U.S. military as full of religious conservatives, but that probably says more about how secular I am than about how religious the military is.
The Knox laws are still in force. There is no legal framework for an American military coup, setting aside that the Commander in Chief is an elected office.
Part of running a coup in a democratic country is that your ground troops have to be on your side. This is much more difficult when every single soldier is made completely aware during indoctrination that domestic military action is almost completely illegal. Adding in the fact that there is no conscription in America, the likelihood of a military coup today is vanishingly small.
Yes, that is what many of us do. We either leave the tech industry, or tech positions, or the area. There is no "shortage"; it is just that "unfillable" positions don't pay a living wage for the locale and experience level.
Uhm, usually because living in the Bay Area is actually a 10x better life for some people. Great weather, very diverse, lots of jobs, public transit is good, arts are amazing, food is amazing, close to sea and ski, etc. etc.
Somehow I doubt that any of that has too much effect on quality of life. I live on less than 400 USD per month and I bet I'm happier than the average Bay Area resident. I also know that I eat far better. Via the internet I have access to all the arts I could ever want. My transit is a bicycle, which is ideal for me. My city is FAR more diverse than anywhere in the US and I'm a few hours from the sea. Weather is a bit hot but I like that. But like I said, pretty sure whether I'm happy vs. sad derives from something else entirely.
Phnom Penh. It's a really nice city if you stay away from places where expats congregate. Lack of open spaces and fields though are something that I miss. It's dirty too, but its other charms make up for it. Every day I buy the best fruits and vegetables I've ever tasted at the market near my apartment.
The bay area art scene could be better, but I doubt 10x better. Fort Worth, which I used to compare in my previous comment, has some pretty strong art museums recognized internationally.
Soon Fort Worth will overtake San Francisco in population, and it was far less disrupted economically in the last recession. Speaking purely from data and statistics, Fort Worth is more diverse than San Francisco with regards to ethnicity, religion, nationality, and so forth. And I doubt there is a substantial difference in the diversity of food options, since I have been to San Francisco. The weather and public transit are likely superior there, though.
Otherwise, if I were single and had 10x the income I would likely agree that the bay area is a more fun place to be.