I don't want to turn this into a philosophical debate, but if what you are saying is true (I hope not), the community is wrong IMHO.
Human languages define what concepts we are able to think and reason about. I believe that George Orwell showed this perfectly in 1984. By taking away a lot of vocabulary, you take away a lot of ways to express yourself. I think that the term "expression" in regard to programming languages explains why this is important.
I only know Dutch, English and Latin (plus just enough German and French to get around when I'm buying bread or asking for directions), so I've in no way experienced how a language like Mandarin changes the way your thoughts take shape. However, I did have the experience of playing around with Haskell and Lisp (not saying I'm experienced or anything, I've just toyed around with them in my free time) and they did change some things about how I perceive and think about code. I can't write code in these languages where I work, but just learning them (and learning their different, but just as valid, viewpoint on programming) has changed the way I think about some problems and how I should solve them.
Language (whether it's human or programming) is all about expressing ideas in the broadest sense of the word. Knowing more just lets you express more...
The thing about Java developers is that they're getting easier to replace (outsource or offshore).
That might not be the case if you're specialized in SAP (ABAP, FICO, ERP) or Oracle (PeopleSoft, CRM, Financials). Sometimes it's due to the nature of the information you're dealing with.
I decided to quit pursuing technical excellence and am seriously thinking of switching careers to either DBA (I've been taking database courses at a local polytechnic college) or Sys/Net Admin. More of an "Ops" kind of career, for many good reasons.
I realized that what I like is to build software from a non-programming perspective (architect, designer, owner). So I decided to look for a lucrative yet stable career, save my money for the long run, and outsource programming jobs.
I know people have a distaste for the word "outsource". But outsourcing doesn't necessarily mean to India. It could also mean contracting out part of the programming work to local talent. I also have the advantage of being born on the other side of the world (somewhere in SE Asia), so I have 2 talent pools to choose from.
I viewed my moves as a series of problem solving steps. Perhaps you should too.
Other than Razor (which is optional), not that much has changed and it is a pretty painless upgrade. Under the old way they used to version software, this would really be more like ASP.NET MVC 1.5 SP1.
The other thing worth noting is way better support for dependency injection, through the DependencyResolver class. I'm becoming particularly attached to the IViewPageActivator interface for providing custom view creation logic (in my case, injection of helpers and custom session information into the page object).
While I do enjoy programming, I don't think software development is a good industry to have a career in. The market is so fragmented right now with different choices of programming languages and different technology requirements. And you are expected to be the expert on those technologies.
I like what I do and it's good to be paid for what you do. But these days corporations want their people to work extra hard and dedicate themselves to the company. Once the company is done with you, they'll spit you out.
I've had first-hand experience seeing someone who helped build the company, loved his job, and worked harder than anyone else in the company get let go when the company didn't do well. Probably because he had a higher salary than anyone else. The least the company could have done was negotiate his salary and let go of someone else (still not ideal, but then..).
Long gone are the days of "employee growth", "personal advancement", or "continuous improvement". Enter the days of "you should already know that" and "study this in your free time".
With the rise of Scrum and XP, developers are becoming interchangeable (outsourced, offshored, or swapped between colleagues).
"While I do enjoy programming, I don't think software development is a good industry to have a career in. The market is so fragmented right now with different choices of programming languages and different technology requirements. And you are expected to be the expert on those technologies."
Exactly, and I suspect that many of the developers being "purged" are not those who aren't passionate, but those who happen to be too much of a generalist.
Today's job listings often ask for experience with libraries A, B, C, D and E. (It's not even about languages anymore; the "Python" jobs are really mostly for Django, and "Ruby" already meant "Rails".) They get literally hundreds of replies, so the chances of getting applications from people who actually do know all those libraries are fairly high. You just pick one of those who seems competent (maybe you'll let them do some of those puzzle questions that seem to be in vogue nowadays) and they can bring value to your project right away.
A generalist, on the other hand, loses out here, because they do not know all of these libraries (yet). It does not matter if they are competent and can learn these new things quickly; they aren't even considered.
So, you just learn these libraries, and next time you will have a better chance of getting the job, right? Yeah, except that the market is so fragmented, that the next job listing will require F, G, H, I and J. There are just too many of them to learn them all.
I'm getting the strong impression that getting a job nowadays is a matter of (1) luck, or (2) knowing the right people. Many companies like to pretend it's about passion, but other filters are applied first.
I do a lot of Python and have even worked on my own web frameworks before. I've done web development with many different Python web frameworks. I also don't consider myself just a web developer.
I was recently interviewing for a Python/Django developer position and was instead offered a SDET position (with the possibility of becoming a developer) because I don't have experience with Django.
I turned them down because it felt like an insult.
I think the term 'generalist' here is a bit ill-defined.
If you're a developer working on web-based software, you are known as the web-developer guy (specialist).
If you're a developer working on transactional, security, concurrency, you are known as the back-end guy (specialist).
If you're a developer working on business-app, you are known as the enterprise developer (specialist).
If you're working at a product-based company developing virtualization, you're specializing.
Developers who work at Google, while they might not use the variety of frameworks out there, are specializing in algorithms, data structures, concurrency, and large-scale development.
If you're looking for work as an internal business-app developer writing JEE components, your recruiter might not want to accept algorithm geniuses who used to work at Google.
A Rails guy will look down on someone who has 7 years of Java experience and is trying to escape Java, even while that person won't say anything bad about his work experiences (not burning bridges).
These days people shape their minds and views pretty quickly.
You mention that job listings get hundreds of replies, yet just a few pages back we have an article [1] about the short supply of developers in Silicon Valley. What gives?
Silicon Valley wants the best and brightest. The filter is opinionated and varies by company.
If you're looking for work at VMWare, your Rails and MongoDB experience won't be needed. They want a C/C++ guy who can hack the Linux kernel or write a device driver.
If you're looking for work at Facebook, they want someone who knows PHP and C/C++ and has probably contributed patches to MySQL or memcached. Or they grab "idea" people like Lars (from Google).
You should also "own" that domain (compiler, AI, optimization, OS/Kernel, Networking, DB/Data-Mining, business-app, ERP/CRM/Enterprise).
This is scary because it forces you to:
1) Put your eggs in one basket: domain
2) And put more eggs in another basket: technology
3) Stay up with the latest in (1) and (2)
This leaves you no room to breathe. To break this cycle you probably have to get out there and do something on your own (meaning: your own business/company).
I don't know. One doesn't exclude the other, I suppose. I have heard about developers being in short supply as well. Notice that they are not saying "nobody replies to my job posting", so it's quite possible they get lots of replies, they just end up picking none of them, for whatever reason. Maybe it's hard to filter through all the replies effectively, maybe they are too picky, maybe the job is unattractive to competent developers, maybe it's something else. It's hard to say.
All I do know is that I have heard the other story as well, i.e. people who are hiring are inundated with replies.
I don't know about anywhere besides my company, but as a generalist technical founder, I would be loath to take someone with experience in X technology over a generalist who's shown the ability to learn new technologies. We work with Flash, Clojure, and assembly, which is obviously the weirdest smorgasbord ever. I have yet to see a resume that mentions anything but Flash, so I've tried to look for people who look smart and a little arrogant instead, while still being personable.
Being an engineer, constantly learning is a given. Technologies change every day. That happens from hardware all the way to the top. Don't bet on just one technology and think your career will end with it. Java, Python, Ruby, and many others are things people talk about, but I guarantee you, new languages, new technologies and new platforms will keep popping up every now and then.
I think this is the biggest difference between programming languages and OS/RDBMS. The changes happening around programming languages are much faster than those in the OS/RDBMS world.
People who know UNIX concepts are most likely employable whether the business is using HP-UX, Solaris, Linux, or BSD. You're a DBA? It doesn't matter much how your years of experience are split between Oracle and SQL Server (as long as you have at least a year in each).
Compare that to a developer who knows OOP in Java and JEE: they're less likely to be employable in shops that use the equivalent technology in C#/.NET (even though the concepts, and largely the language, are the same).
My point here is that the rate of change in programming languages/development tools is very unhealthy. People don't even have time to stabilize and learn other skills (communication, organization, networking, understanding business needs).
Agile is such a generalization. There are many practices below Agile such as Scrum, XP, Lean, Kanban, DSDM, FDD, etc.
Agile itself is just a set of principles. The actual implementation still requires some processes.
I don't think people should attack Agile as a whole; they should focus on which practices they choose. For example: Scrum is an agile practice for project management. But in order for a shop to use Scrum, they must implement certain practices from XP. XP is more of an agile practice for software developers.
There are shops which implement only Scrum but not the XP part. That is to say, they implement the whole Sprint, Retrospectives, Backlog, and Stand-up meetings, but they don't refactor bad code and they don't try their best to have automation in place. At the end of each sprint, they never test the whole app; they only test whatever the sprint accomplished. This is even worse than the previous practice.
I wholeheartedly agree with those who have been burned by management who think that Scrum or XP will make things better in a short time. I hate to say this, but it takes a very long time for a company to change its culture and mindset (depending on the size of the company and what the company does: product vs. service).
If your company is switching to something new, make sure they know that methodology inside out. Make sure they meet the prerequisites. Make sure they are willing to sacrifice their time and money for a while.
Otherwise, get your resume ready cause the ship will sink faster than before.
On the other hand, some people might not agree with this, but when you have one of these processes in place and everything in your engineering department runs well, that doesn't mean life is good. When the company is not doing well, your senior engineers who receive a big fat check every month might be good candidates to be let go. The reason is that your intermediate and junior engineers already know everything inside out (because one of the practices in Agile is to share knowledge, cross-functional teams, or whatever).
To me (as an "enterprise" developer), discipline means knowing when to say NO to a feature. I think traditionally Enterprise software is developed with the crazy "advanced" (5%) scenario in mind vs. the "regular" (95%) use cases.
Edit: Just re-read the article and realized he said the exact same thing (95% vs 5%).
I'm actually flirting with moving to everything Oracle for my career (seriously). There's big money and (hopefully) less work for those who know how to use 10% of Oracle's other software aside from their database.
Now I can go home by 5 and have a side project/job in the evenings and on weekends using Emacs, Ruby, Rails, and jQuery.
Careful. What you say is somewhat true. But that side project/job doing what you really love is going to be increasingly starved for time. You may want to have a family or friends or hike or read or learn to play piano at some point. You'll be stuck with a somewhat non-enjoyable day job where much of the focus is on Oracle's tool stack.
That said, I poked my head up after grad school, saw that there was some demand for skills that started with the "Oracle" adjective, and have found reasonably stable and lucrative work in those spaces.
I've also been able to move to a more open shop, where I've been able to sneak in some RoR and am currently hacking some Lisp to do, of all things, some ETL/data transformation stuff to make my life a little easier.
My bigger issue is that if you position yourself as the Oracle stack expert, you may wind up stuck there professionally. I laughed when I read it, but I remember PG saying something about never being worried about competitors to Viaweb who advertised for Oracle programmers. It's a bit of an exaggeration, but the hackers I respect the most have been somewhat disappointed in their corporate "big IT" careers.
Thank you for your response and for sharing your experience as well. Greatly appreciated. It would be great if I could have some stability in my career and yet still be able to work on something else (be it RoR or LISP or something else).
<rant>
I've been working at a few of what people would call "software product" companies. The kind that Joel mentioned a lot in his essays.
First of all, if I were in Silicon Valley, I probably wouldn't even think of Oracle. I would hone my CS skills so hard that I could hack my own compiler. Alas, I don't live there. Even if I did, I'd have an expiration date stamped on my head: good only for 10 years after graduation. Silicon Valley wants the hot-shot, young, energetic, red-bull-drinking, all-nighter type of programmer. So if I can't be a "product manager" or "CTO" or management material by the time I'm 32-35 years old, it's time to get out of SV.
Product-based companies tend to be unstable where I live. I've also noticed this trend isn't particular to where I live; it's everywhere else too. When the product is not selling well, lay-offs happen. The first to go are usually the QAs.
Once the QAs are gone, the next to go are the "internal" tools developers: the IT staff and the tool developers. The last to go are the junior developers and weak performers. The ones who are left must work super hard to prove that they're worth their salary.
A product-based company is cool during its first 3 years. After that, it's all maintenance of legacy and hacked-up code. I don't know which one is worse: maintaining a half-assed, hacked-up, badly designed product with tight deadlines (which usually leads to OT) or writing PL/SQL and customizing Oracle modules.
Here's another problem: standards, scrum, xp. These are great things to have in a software product house for quality and longevity of the company. But at the same time, they are a double-edged sword.
Let me explain why: once the standards are in place, most people are replaceable. Take Scrum/XP for example. One of their important points is that we all should share knowledge (via pair programming, or something else).
They want to make the unknown less unknown, or known. Once the work is repeatable and known, you have no value to the company anymore. Intermediate becomes Senior, Junior becomes Intermediate, you're let go, and they start to hire new people.
The choice is either to move up to management or not to do Scrum/XP (which is equally horrific). Not doing Scrum/XP would lead to bad results, bad quality, an unhealthy working environment, and the need for hero-like efforts to fix some bugs.
I have seen my friends keep changing jobs every 2-3 years. That might be okay with them, but not with me. I don't like wasting my time preparing for interviews and cleaning up my resume every 2-3 years. I don't mind learning and improving myself, but not for the sake of that kind of cycle.
Here's another problem with software development: programming languages. Too freaking many of them. People have too many opinions. Some like LISP, some like Java, some swear by .NET, some would invent a company based on F#, some deal with Struts1/2, some would want PHP/Drupal/Wordpress. This leads to a very fragmented field.
I rarely see a company that is looking for the bare minimum (say, Java, or C#, or Ruby) but with X-years of experience. I often see companies looking for specifics (must know Java, Struts, XML, XQuery, XPath, XSLT with 7 years of experience).
I thought about doing Rails and iPhone for a while until a couple of days ago, when it hit me that you can actually outsource iPhone app development. Those 2 guys that were interviewed by Mixergy did exactly that.
I also see a few consulting offers lingering on a local job board looking for an iPhone/Android developer. But most of them are unstable due to the nature of consulting. I don't think they're willing to pay the premium ($100-$125) anyway.
Some consultants might be able to charge a premium during the first few years of a new technology (like the iPhone), but they need to find the next big thing again every 2 years.
The point is this: low entry barrier sucks.
When I look at Oracle, the barrier to entry is a bit higher (or so it seems) than being a developer, and not too many people want to do the job. It's a niche. Just like what one of the HN-ers mentioned about how he did quite well with his freelancing/consulting gig (he's doing PHP, Drupal, and Wordpress).
</rant>
So, if I understand you correctly, you'd like a 9-to-5 job where you won't be forced into management and be able to keep your valuable skills to yourself while you work on more interesting things after hours?
I think I am currently in that situation. It's not all it's cracked up to be. First off, your time on nights and weekends is not as long or of the same quality as 9 to 5. You don't have the same energy level or focus.
As for the job, no job is immune to change. If you have skills that are valuable to your employer, you'll find that either your employer will overwhelm you with work and/or they'll seek to have your co-workers gain those skills to distribute the workload. This kind of stuff will make you more and more unhappy with your job and this will surprisingly make it harder to work on your side projects outside of work. If you put all your eggs in one basket by anchoring yourself to the Oracle stack, when you go to look for another job (because you're tired of your current job), prospective employers (and especially recruiters) will see mostly Oracle stuff and you'll be pigeonholed into jobs that are similar to your current job.
Don't make perceived stability the central focus of your job search (unless you have others that depend on you financially). Choose jobs that will serve you better with respect to your career goals. That may mean changing jobs more often than you'd like (but not necessarily so), but you'll get to where you want to be sooner.
I don't mind occasional OT (with compensation). Fact is, where I live, companies can get away with lots of OT and not compensating for it. I don't mind going into a management position late in my career. Not now, but maybe later.
Interesting things can mean different things. To me, Google's architecture (GFS, BigTable, MapReduce) is all interesting, but I have no desire to learn it for the sake of learning. Rails is interesting in the sense that I'd like to learn it, make a website using it, and take a poke at running a small business.
I'm also interested in the business of iPhone apps. I'd rather pay someone else to write the app than write it myself. I prefer to focus on the operational side: making sure we have a website with good SEO and graphic design, promoting the iPhone apps, and starting to make money (even with a low profit margin).
But at the same time, I'm not a big gambler or a risk taker. Steady fixed income and experimental on the side is my sweet spot.
In short, I'd like to be able to run my own side business, be it iPhone apps or selling stuff online. That excites me more than toying with various programming languages or solving hard algorithm problems. I prefer talking with people to talking with machines. I don't mind doing occasional programming because I like building stuff. I just don't want it to dominate my whole life. I prefer not to be 40 or 50 and still hacking C, UNIX, Java, C# or Ruby for a living.
I prefer to spend 100% of my time on benefiting from the growing gap left between the Oracles, Microsofts and SAPs on one end and the "integrate a bunch of FOSS" approach on the other. I think there is room for simple, value-for-money, intelligent, low-maintenance software, even in the enterprise.
It's kind of hard to describe, lengthy, and it could just be my own limited experience.
Let's just go with this: typically, when a company makes the decision to go with Oracle, they've made it for business/marketing reasons, not technical ones. And this is probably how they're going to make the majority of the rest of their decisions.
I understand and am well aware of that. Having said that, I also often see companies purchase Oracle for both technical and business reasons. I saw first-hand a multi-financial company that opted for Oracle (as opposed to SQL Server, which they benchmarked) to build their financial system, and they are pleased with it.
No arguments between TDD, BDD, DDD, C# vs Java, Linux vs Windows, Commercial vs FOSS and all that crap. Just Oracle, PL/SQL, Forms, and Oracle Financials. There's no Java code at all around there. Purely "module" based. Testing is a lot easier compared to testing your own "financial" module written in JEE.
I see it as more practical and suits the business well.
Of course if the business requires infinite customization (in the case of that crazy idea called Business Process Redesign), then maybe hiring a team of software developers is better than buying Oracle.
There's some amount of problem solving and planning involved. But not as challenging as implementing the next distributed storage mechanism. I'm probably close to done with that world. I'm not a research scientist.
I'm looking at the specific area: financial services (not trading, but more traditional than that: transactions, accounting, etc). I'd love to learn about finances more in conjunction with doing IT related work.
The important message is this: "focus on the process to build a great company and great products... They didn’t become successful because they built these systems. They built these systems because they became successful."
The next unproven startup that uses a NoSQL solution is just wasting its time. I know this because that happened at one of my previous workplaces (we used HBase).
At the last startup where I worked, we built our own document store (in Scheme!), with exactly the same reasoning that causes other startups to over-engineer for scale before we had any users.
Premature optimization is a kind of over-engineering. Thankfully, it became so frequently hammered into new coders' heads that premature optimization is a Bad Thing that it's no longer much of a worry. It seems to have been replaced with premature scaling in the past few years, though. Anyone that could solve the problem of rampant over-engineering could potentially wield the awesome power whispered about in legends.
The problem is that over-engineering often has its roots in the very same drives that brought many of us into technology in the first place: the opportunity to work on something "really cool." As a developer turned manager, I've seen both sides of the coin, and it is a delicate balancing act. You don't want to completely stifle innovation (and morale) by being hard-headed about over-engineering. Conversely, you want to direct that energy appropriately so that you don't create a culture of "under-engineering" and unhappy developers.
I totally agree. I may be totally wrong on this point, but it seems to me that over-engineering (in the forms of feature-creep, premature optimization, premature scaling, over-specification, planning for too many unlikely use cases, and all its other myriad forms) is often a tunnel-vision problem, a feature of the blessing/curse of engineers' tendency to focus really hard on a single thing.
If you look at the single-page source code to Plan 9's implementation of the cat command, and then compare it to GNU's, you can almost see it directly. The Plan 9 coders were looking more broadly at the OS and knew they'd need a cat command, while the GNU coders were thinking just about the Unix userspace, so when Plan 9 had cat, they continued hacking by moving to the next thing, but when GNU had it, they continued hacking by stuffing more things into cat.
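To make the point concrete, here is roughly what the "minimal cat" half of that comparison amounts to; this is just an illustrative Python sketch of "copy input to output and nothing else", not the actual Plan 9 source:

    # Illustrative only: a minimal "cat" that copies each named file
    # (or stdin, if no arguments) to stdout and does nothing else.
    import sys

    def cat(paths):
        if not paths:
            sys.stdout.write(sys.stdin.read())
        for path in paths:
            with open(path) as f:
                sys.stdout.write(f.read())

    if __name__ == "__main__":
        cat(sys.argv[1:])

Everything else (line numbering, squeezing blank lines, and so on) is what accretes on top of that core.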
Hackers love hacking, and this is great for the hackers and the users, but one rarely hears "This program is sufficient. I will consider it complete and write a different one until it becomes clear that the program is insufficient." (I don't exempt myself here.) Or maybe it's just too apparent when the designers forget it, so it stands out.
It's even more apparent on the web, where the default mode of a site is to see itself as a site rather than a component of the web at large.
At the same time though, you're saying you failed solely because you used HBase? Sure twitter's architecture evolved out of necessity but starting out with some NoSQL solution doesn't automatically set you up for failure.
That's one of the tough myths to overcome when building on top of a relational database. Changing the Database has traditionally been Hard, so everybody has been trained to think that way, therefore nobody is allowed to change the database because Changing the Database is hard. Try it at your bigco and the old guys will do everything in their power to stop you (thus making changing the database hard to do).
Ignore that rule and build yourself an environment where changing the database is easy. I make schema changes to my stuff all the time, and seldom push a release live that doesn't do so. The tools are in place to ensure that it's No Big Deal, so it just works.
If you live in a world where changing your SQL database is easy, it sort of takes the wind out of the "start with NoSQL, because changing the database is easy" argument. You get all the speed advantages of being schema-flexible, and you can write ad-hoc queries when you want to, so you're flexible in that direction too.
Let me clarify - I'm a startup hacker, not a bigco guy at all :)
And I still use MySQL all the time, right alongside so-called NoSQL solutions where they are a better fit for a given purpose: Membase for high-availability collections on the order of billions of records in a social game; MySQL for defining the game world itself; mongoDB for any and all data for which eventual consistency doesn't matter (e.g. analytics). I've streamlined my MySQL dealings in precisely the ways you outlined. I have change scripts for every schema change, and I have declarative, YML-driven schema auto-generation in Symfony. Schema changes are pushed out through staging to production - lazily when possible, and actively when necessary.
But despite all the process improvements in the world, the dev time savings that go along with a smart, document-oriented model layer are not a myth. I assure you, they are very real. No longer does every new feature have to start with schema design (no matter how streamlined your schema alteration process may be, it has a nonzero cost, and one which definitely increases with scale). You can instead just get right to the code, and start setting and getting the properties your new feature will need.
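For illustration, here is a minimal sketch of that workflow, assuming MongoDB via pymongo; the collection name and the new property are made up:

    # Hypothetical example: shipping a new "pinned_items" feature without
    # a schema design step. Collection and field names are made up.
    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["app"]
    user_id = "u-123"  # whatever identifies the user in your system

    # Start setting the new property directly; no migration needed.
    db.users.update_one({"_id": user_id}, {"$set": {"pinned_items": []}})

    # Readers treat the property as optional until it exists everywhere.
    user = db.users.find_one({"_id": user_id})
    pinned = user.get("pinned_items", []) if user else []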
The fact remains: SQL is not a one-size-fits-all solution any more than the recent "massively scalable" data stores are. A modern backend engineer should know a lot about a variety of datastore solutions, and should think long and hard about which data should be stored in which manner(s).
- Change scripts for every schema change, stored in source control.
- An automated build/deploy that pulls down new change scripts and executes them in order.
- (optional) a good way to generate your backend CRUD by looking at the existing database schema, or as a lesser option an ORM that does the same thing.
So your workflow is: script out your schema change, check it in, apply it to your dev environment, regenerate the CRUD from your local schema, fix any compile errors that you've introduced. (and optionally make sure all your unit tests still pass).
You'll notice that all that stuff above is just the basic workflow you should have in place anyway.
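As a sketch of what the "pulls down new change scripts and executes them in order" step can look like (the file layout, table name, and use of SQLite here are illustrative assumptions, not anyone's actual tooling):

    # Illustrative sketch: apply numbered .sql change scripts in order
    # and record which ones have run. SQLite keeps it self-contained.
    import sqlite3
    from pathlib import Path

    def apply_migrations(db_path="app.db", migrations_dir="migrations"):
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)"
        )
        applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}

        # Scripts are named so lexical order matches intended order,
        # e.g. 001_create_users.sql, 002_add_email_to_users.sql, ...
        for script in sorted(Path(migrations_dir).glob("*.sql")):
            if script.name not in applied:
                conn.executescript(script.read_text())
                conn.execute(
                    "INSERT INTO schema_migrations (name) VALUES (?)", (script.name,)
                )
                conn.commit()
        conn.close()

    if __name__ == "__main__":
        apply_migrations()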
The problem is not whether you can, but how long it takes. At least with MySQL, an ALTER TABLE on a large table will take a long time (hours, if not more), during which the table cannot be written to.
That's not representative of RDBMSs in general. In Oracle we don't think twice about adding a column to a table during production hours. The only issue is if the new column has a default value and you have millions of rows that you need to "backfill" but that's just a big transaction; it's nothing remarkable in and of itself, if you could do a transaction that big anyway, you'd just go ahead and do it. And of course, in Oracle readers don't block writers and writers don't block readers, we have MVCC.
Once again, NoSQL is shown to be a reaction against MySQL, not RDBMSs in general.
That was but one example out of many. Oracle is a great database. It isn't great at everything, however. Geographically distributed, fault-tolerant, scale-out architectures, as often required for big online services? Not a great fit. Multi-petabyte complex analytics processing? Not a great fit.
There is a set of relational database folks fixated on the false dichotomy of relational databases OR non-relational databases. In practice, they are often combined in a variety of ways. Insisting on One True Database is like insisting on One True Operating System or One True Programming Language. Stop obsessing over tools and build useful stuff!
Oh indeed, right tool for the job, I'm 100% with you there.
The issue is, from the NoSQL camp we hear about "schema rigidity". We hear "SQL doesn't scale". These things simply aren't true! It's as if someone had only ever written .BAT files on DOS and thought that all its limitations applied to Python as well (and went and told experienced professional Python devs that!).
There have been "object repositories" such as Versant for a long time. The NoSQL types seem oblivious to these too.
Schema rigidity is a canard originating with the same folks who think a hash is a type system. I encourage you to dismiss the folks who say things like that, rather than dismissing some very useful technologies.
Yes, but Oracle is expensive, especially since it requires people who really know Oracle if you want to be up to speed relatively quickly. So as always, it is a tradeoff: it seems that in some cases, not having the usual RDBMS guarantees is OK because there are lower admin costs, etc...
So sure, some people don't understand those tradeoffs and make stupid choices. But people who choose technology without properly assessing the risks/advantages are bound to fail anyway.
If your major expense is people who know what they're doing, then Oracle is not even the most expensive platform... During any web boom I bet LAMP guys were billing higher hourly rates than people doing Oracle!
Google and Facebook have all the money and talent required to deploy epic Oracle systems. Instead they use GFS and BigTable and Cassandra and HBase and Scribe and Hadoop and MySQL and a host of other systems. Amazon has massive Oracle deployments, so plenty of money and knowledge on the topic, but still built Dynamo and S3.
There are billions of dollars riding on this for them. Instead of insisting they are wrong, ask yourself why they might be right.
Google and Facebook aren't good examples, because they have no hard transactional requirements for their main applications. If a web page isn't included in one search result but is in the same search executed on a different node 5 minutes later, who would notice or care? If a status update gets dropped, it might be annoying, but you can always just resend it.
If you want to compare like for like, ask why Visa isn't using MongoDB for authorizations, or why American Airlines isn't using Redis for reservations.
So you are aware, yours is the traditional response. Somehow, doing what Google and Facebook do is "easy" because not everything requires transactions. This is false both because scale like that makes almost everything difficult, and because, as Google recently published, they are using transactions for their main application. NoSQL does not imply lack of transactions and transactions do not imply relational databases.
I doubt that any DB that stores data physically in a row oriented format doesn't have issues with this particular type of change. But of course there are many other ways to work around that. For instance, you could just create a new table and a view to join the two. Or, if a particular table changes all the time, redesign it to store key value pairs instead. RDBMS can easily be used in a schemaless fashion if needed for particular scenarios.
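A rough sketch of that key-value idea, using SQLite and made-up table names just for illustration; the "flexible" attributes live in a side table instead of columns:

    # Illustrative sketch of using an RDBMS "schemalessly" via a key-value
    # (entity-attribute-value) side table. SQLite and all names are made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE user_attrs (
            user_id INTEGER REFERENCES users(id),
            key     TEXT,
            value   TEXT,
            PRIMARY KEY (user_id, key)
        );
    """)
    conn.execute("INSERT INTO users (id, name) VALUES (1, 'alice')")

    # A brand-new attribute needs no ALTER TABLE:
    conn.execute(
        "INSERT INTO user_attrs (user_id, key, value) VALUES (1, 'theme', 'dark')"
    )

    # A join (or a view over it) stitches the fixed and flexible parts together.
    for row in conn.execute(
        "SELECT u.name, a.key, a.value FROM users u JOIN user_attrs a ON a.user_id = u.id"
    ):
        print(row)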
Whether it saves time or not depends on a lot of things. If users actually determine the structure of your data then you're right. If it's just about schema evolution by developers I don't think that moving schema constraints into procedural code makes things simpler or more flexible.
An RDBMS does require a bit of planning (not too much, and not too rigid) and previous experience building good data models; both traits are missing in hotshot hackers/startups these days because they either don't have the patience or the deeper experience.
I seriously doubt that this is true. SQL databases are extremely easy to adjust. Hell, you can add columns on your live database in the middle of the day if you want. Removing data is harder, but normally you don't have to delete columns just to launch your new code.
Another thing to remember is that most SQL systems are mature and there are pretty much always very good tools available to do any kind of change you may need to do.
The existence of great tools for relational databases is a compelling argument for using them. As I said in the article, starting off using a single, monolithic relational store is a successful approach employed by many, successful companies. I would suggest, though, that the rest of your comment indicates a lack of experience with relational databases at large-scale. One metric provided by Twitter in a presentation was that an ALTER TABLE command took 2 weeks to run on a previously centralized relational database. Perhaps someone from Twitter can add some color to that anecdote.
Like I say, in business there is no cheap or expensive. There's worth the money, or not.
Yes, Oracle costs money. But so does "rolling your own". How much of their VCs cash has say Twitter spent doing that? Yes, Oracle "locks you in". But your own legacy code locks you in whatever platform you've built it on.
In the specific case of FlockDB, that's actually not the case as the shards are modular. SQLShard is one implementation. There was an experimental Redis shard implementation, as well. You might be both overestimating the cost of building things like this and underestimating the enormous drag of using closed source components in these systems. If you've personally built online services this big with Oracle, well done. What did it cost?
Once you have hash joins, shards start to look awfully restrictive compared to partitions... MySQL (et al, I don't know about FlockDB) can freely shard because they're not losing functionality they don't have in the first place.
Maybe I'm not a good example because I've mainly worked in financial services, but nearly every Oracle project I've worked on has been wildly profitable, and most have been at the level (in terms of transactional throughput, and in terms of features that were cheaper to buy than build) at which there are only really two choices, Oracle or DB2.
Vendors highly optimize for your case, and you find that their solutions work for you. Not a surprise. Different constraints and different requirements produce different solutions. Nobody is saying you are doing financial services storage and processing wrong. Why are you so insistent experts in a totally different field are doing their jobs wrong?
Absent a metric for "scale" it isn't false, it's meaningless. Here is some context to help you distinguish how things work for online services as opposed to financials:
As you are not printing money, you care about cost efficiency. This means you are biased towards using white box hardware, and having as little variation as possible. In the financial world, they use name brand hardware and whatever configurations make sense for a specific application, which is good, because Oracle and IBM are not interested in supporting their products on white boxes. You're Twitter, you have 15 billion edges in this database. Each one is conservatively 24 bytes. 360GB of raw data. Everything you do depends on that data being always available, extremely tolerant of component failures, accessible with very low latency, so assume it all has to be in RAM on a bunch of machines. The Oracle answer is a cluster of fat, named-brand servers (totally unlike all the rest of your hardware) with a dedicated interconnect, a lot of license fees, support contracts, and dedicated DBAs (plural, since you'll need them on-call). Special hardware, headcount, license and support costs. Hit a bug? Call Oracle. Need a feature? Call Oracle. Wish that precious headcount was filled by engineers writing code instead of DBAs carrying pagers? Too bad.
When scaling includes having to fit into the same model as all the rest of your infrastructure, and that model is not the Oracle model, then it becomes a bit clearer why engineers might dismiss Oracle. They are left with those inferior options like MySQL, which you've already agreed has various scaling issues (though Facebook somehow manages). So, they invest a few months of a few engineers and they have a purpose built system that does what they need, fits the rest of their model, for which they have the source, and for which the maintenance costs are likely to be far lower.
Like I said in the article: absent a specific problem for the business, SQL vs NoSQL is just noise. Something like FlockDB is far simpler and cheaper than throwing Oracle at the problem. That's not a technical argument, it's a business argument. If you want to argue that the big players don't know how to run their businesses, I will not try to stop you.
You can, right now, go to Dell's website and buy, off the shelf, a server with 144G of RAM. Stick Red Hat on it and Oracle considers it fully supported. The days of "vanity" brands that you'd stick behind a glass partition and take your investors on a tour of the datacentre to see are loooong gone.
And the reason you need devs and DBAs to be separate isn't one of different skills at all, both speak PL/SQL fluently. It's just if the regulator of your industry requires that the code be developed and deployed by different people. If not, it's normal for the two camps to have significant overlap.
Yes, you can go buy name-brand hardware (Dell, IBM, Sun, HP) in a special configuration not used anywhere else in your infrastructure and install an OS on it which you don't run on any other system, all just so you can have the honor of paying Oracle. That was my point.
Eh? Dell is very much not a brand like IBM or HP, they are pioneers in the field of cheap generic hardware. That's why I chose them as an example! And Red Hat is hardly an obscure OS these days either... And Oracle runs happily on Windows or many other common OSs too...
What scale? How many concurrent users of your specific site break MySQL, and how many concurrent users do you have? Not answering or considering those questions is what leads to premature scaling.
Since I cannot edit my original comment (no idea why), I'd put my update here:
Looks like the people who replied to my post got into a deep technical discussion without mentioning a startup on the scale of Twitter. Exactly what the author pointed out: no successful product yet.
I hope people can see a pattern here:
Twitter: FlockDB, and varieties of supporting infrastructure
Facebook: Cassandra and in-house tools
Google : GFS, BigTable, and in-house tools
LinkedIn: Voldemort + varieties of supporting tools
Each successful startup built a NoSQL solution based on its own needs.
There are programmers like this today. They can install Windows and MS Office, but once something more complex happens on their workstation at the office, they'll call the sysadmin to fix it for them.
It's a mix of laziness and not wanting to know anything outside VB/C#. It's an excuse to get a coffee break, a smoke break, or something else.
I have to agree with you though: knowing English and Mandarin is better (at least for me) than knowing programming languages.