Are Devs the New Superheroes and Rockstars? What About IT?
With the proliferation of smartphones and touch coming of age in the mobile and desktop world, we’re really starting to border on Minority Report territory. Applications are an end user’s interface into the online world, and they create the experience. Some are good, some are horrendous. We’ve all seen and used bad applications over the years. Even as a backend kind of guy who associates himself more with the IT space overall, I know it’s all about the end user and/or the business. It’s something I have to remind some folks of when I’m onsite doing work for them. We support them. Having said that, a few things lately have raised my dander – namely Code.org and this blog post (which I saw via Grant Fritchey’s blog post). Why do these bother me?
Let me say this up front: communication is a fundamental issue in many organizations, and there’s enough blame pie to go around. Politics and other roadblocks can slow any process down, causing frustration everywhere. Poor communication and broken processes hurt more than they help. However, process and communication should NOT be difficult. Implemented properly, they should be easy and transparent. Things like change management and source control are fundamental concepts in both IT and development, and should enhance things like high availability. To me, they are cornerstones and core tenets of an overall availability strategy. However, how many of you DBAs use source control … go ahead … I’ll wait. In my experience, that number is very low. In a similar fashion, change control and things like testing are often viewed as a nuisance by developers, and pushing directly to production can often lead to many headaches for DBAs. We annoy you, you annoy us; a perfect recipe for a standoff.
I don’t disagree with the fundamental premise of Code.org. I remember getting started on a Commodore PET and then a 64 with things in BASIC like:
10 PRINT “HI”
20 GOTO 10
Even us IT guys need some programming skillz(tm). (See what I did there? I’m pandering to the hipster crowd.) Things like PowerShell or our very own Transact-SQL (T-SQL to those “in the know”) in SQL Server require you to understand at least fundamental programming concepts. Heck, there’s even WMI and SMO (among other things) if you really want to get adventurous in the Windows/SQL Server world. The longer video Code.org made is cute when they ask the kids what they want to be and whether they know what a computer programmer is. Most little kids want to be firemen, athletes, princesses, etc. That makes sense. Being a programmer is way too practical. I mean, you don’t see anyone at the age of 5 asking to be a sanitation worker or insurance broker, do you? I’m sure there are some kids, but they’d be a reaaaaaaaaaaaaaaaaaaaaally small percentage. Kids are allowed to be dreamers. Adults need to pay bills and live in reality most of the time.
But that got me thinking – why is it OK to glamorize just programming in the computer realm? Everything else is as important, if not more so, to the day-to-day running of things. Do you think Amazon.com runs on a single web server and only one database server? Heck no! Both their application and their backend are designed to scale and be available. If you’re going to teach programming, kids need fundamental IT basics, too. Things like understanding backups would serve them well even in their daily lives. How many times do we have to hear things like, “I lost all of my photos on my external hard drive/phone and I didn’t have them anywhere else”? People’s lives are wrapped up in digital.
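Even a basic backup habit comes down to a simple idea: keep a second copy of everything, somewhere else. As a minimal sketch (my own illustration, not any particular tool; the function name and paths are made up), here it is in a few lines of Python:

```python
import shutil
from pathlib import Path


def backup(source: str, destination: str) -> int:
    """Copy every file under source into destination, preserving the
    folder tree. Returns the number of files copied.

    Deliberately naive: no incremental copies, no verification, no
    rotation -- just "a second copy somewhere else."
    """
    src, dst = Path(source), Path(destination)
    copied = 0
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copies contents plus timestamps
            copied += 1
    return copied
```

One copy on an external drive is a start; a second copy in a second place is what actually saves your photos when that drive dies.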
As I said, apps are the gateway for people. Be it a browser or one on your phone, it makes sense to say that programming and computers should be a fundamental part of one’s education curriculum earlier in life. That part I don’t disagree with. But what they are not telling you about is how to make it all work – the full view of the application lifecycle. It’s one thing to write an app. It’s another for it to work well and perform. Many of us who are consultants would have fewer opportunities if applications were written properly, scaled, and supported what they needed to. I see way too often that applications are the barrier for upgrades in many environments, leading to many different – and hard – problems to solve as time goes on. How many applications in your environment – third party or custom – support SQL Server 2008 or later? I bet not all do. Or if they do, your company won’t upgrade, staying on an older version of SQL Server for other reasons like cost … and as time goes on this puts not only your environment at higher risk for issues, but your skills, too. SQL Server 2005 is now nearly 10 years old and four (4) major versions behind. That’s like dog years in the technology world.
Developers also make false assumptions – like assuming availability is only IT’s problem. It’s not. There’s stuff they need to do, too, yet they will blame us for their application woes when the application barfs after a failover. Too bad, so sad. Now we all live with the pain of those decisions. Lose the blinders: you need to be part of the solution from day one, not part of the problem. This is true whether you are using SQL Server, Oracle, or anything else. DBAs are often the last to know about an implementation – and thus take a lot of heat and blame if/when things go wrong. DBAs need to be in on the planning from day one.
The thing that got me the most was the graphic in the NoDBA post – namely the bit where he has “Heroic Developers” and automatically associates things like bureaucracy and delays with data management. Huh? Sure, DBAs can be a pain in the tuchus. So can devs. Or network admins. Or storage admins (DBAs never have problems with storage folks, right?). You get the point. As I mentioned, process is a necessary evil for things like availability. In his post, Martin Fowler does acknowledge that one of the negatives of the dev-to-prod approach can be that “bypassing DBA groups may also mean bypassing operations groups that know how to keep valuable data backed up and secure”. Amen. But I still won’t allow a dev in production if I can help it.
The bottom line: in the real world, we’re all part of the solution, and may even be part of the problem to someone else. There’s a reason in most cases devs should not just push code – let alone untested code – out to the world. But IT also needs to be more agile than it has been. That’s one of the things we do here at SQLHA – we help organizations get up to speed and, during implementation, avoid crippling processes that don’t work for anyone. This is one of the reasons why virtualization and concepts like the private cloud are taking hold – it’s more agile than procuring new hardware for every deployment. It doesn’t make it right or wrong, but it changes the dynamics. Smartphones and apps have changed the dynamics of even rapid application development. IT unfortunately hasn’t always caught up to meet that demand.
It’s time for everyone to grow up. Kids should learn about computers right along with math and other key subjects. Devs and the applications they develop are important, but so is IT. You really can’t have one without the other, so let’s find a better way to work together and start earlier. That way we can have less finger pointing later on. Deal?
Well, it’s really about the app model: trying to get lots of people making these little “apps” that consume Microsoft, Facebook, and other “services” – which translates into usage, and revenue.
It doesn’t take much to build a “little” app that MIGHT be a hit. The Mythical Man-Month talks about going from a program to a real application. This is all about getting the minions to create these little apps first, then worrying about the science of making sustainable, hardened, supportable, reliable, performant, etc. stuff later….