DataSets Suck

First off, a correction. In my recent post on OLTP using DataSets, I gave four methods that would allow you to handle non-conflicting updates of a row using the same initial data state. While chasing down a tangent later, I realized that method 2 wouldn't work. Here's why:

The auto-generated Update for a datatable does a "SET" operation on all the fields of the row and depends on the WHERE clause to make sure that it isn't going to change something that wasn't meant to be changed. Which means that option 2 wouldn't just be a poor OLTP solution; it would overwrite prior updates without any notice. Much better to simply throw a DbConcurrencyException and let the application handle the discrepancy (or not).
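For reference, the command generated by the "optimistic concurrency" option looks roughly like this. This is a simplified sketch against a hypothetical Customer table with two updatable columns; the real generated text also adds IsNull checks and extra parameters for nullable columns:

' A simplified sketch of the designer-generated "optimistic" UPDATE for a
' hypothetical Customer table. Every column is SET, and every column is
' compared against its original value, so a concurrent change to any column
' makes the WHERE clause match zero rows and the adapter throws.
Module GeneratedUpdateShape
    Public Const OptimisticUpdate As String = _
        "UPDATE Customer " & _
        "SET MaritalStatus = @MaritalStatus, Address = @Address " & _
        "WHERE CustomerID = @Original_CustomerID " & _
        "AND MaritalStatus = @Original_MaritalStatus " & _
        "AND Address = @Original_Address"
End Module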

Which also answers Udi's question of why it doesn't do that out of the box. It'd be nice if the defaults were implemented with a more robust OLTP scenario in mind, though. It'd be pretty complex, but that's because OLTP has inherent complexities. You would either have to generate the Update statement on the fly (thus breaking the new ADO.NET 2.0 batch option on the adapters) or put the logic at the field level (using an SQL "CASE" statement). I'm not sure how efficient CASE is on the server, but that could potentially fix my 2nd option.
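Here's a rough sketch of what that field-level approach might look like, again against a hypothetical Customer table. Nullable columns would need IsNull handling, and the @Original_* parameters would be bound to DataRowVersion.Original, just as the designer-generated commands do:

' Sketch of a field-level UPDATE: each SET uses CASE so a field this client
' never changed keeps whatever value is in the row now, and the WHERE clause
' only rejects the update when someone else changed a field this client also
' changed. Hypothetical table and columns; NULL handling omitted.
Imports System.Data.SqlClient

Module FieldLevelUpdate
    Public Function BuildUpdateCommand(ByVal conn As SqlConnection) As SqlCommand
        Dim sql As String = _
            "UPDATE Customer SET " & _
            " MaritalStatus = CASE WHEN @MaritalStatus <> @Original_MaritalStatus " & _
            "                      THEN @MaritalStatus ELSE MaritalStatus END, " & _
            " Address = CASE WHEN @Address <> @Original_Address " & _
            "                THEN @Address ELSE Address END " & _
            "WHERE CustomerID = @CustomerID " & _
            " AND (MaritalStatus = @Original_MaritalStatus OR @MaritalStatus = @Original_MaritalStatus) " & _
            " AND (Address = @Original_Address OR @Address = @Original_Address)"

        ' Bind the @* parameters to the row's current values and the @Original_*
        ' parameters to DataRowVersion.Original before wiring this up as the
        ' adapter's UpdateCommand.
        Return New SqlCommand(sql, conn)
    End Function
End Module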

But this brings me to my second and broader point again: the disdain that "real" programmers have for datasets. This was refreshed for me recently by a blog post from Karl Seguin at CodeBetter. I liked that post a lot (about using a coding test when evaluating potential hires) until I got to the bit about tell-tale signs he would look for. Right at the top?

Datasets and SqlDataSource are very bad

He has since amended that to:

Datasets and SqlDataSource are very bad (update: the dataset thing didn't go over too well in the comments ;) )

and added in the comments:

Sorry everyone...I've always had a thing against datasets...

He's not alone here. It's a common trait among highly technical programmers to hold datasets in contempt. Which would be fair enough if they were willing to give reasons or support for the position. If I felt that such statements came from an informed foundation, there wouldn't be much to quibble about. Unfortunately, too often this is simply not the case.

On those rare occasions when I can get one of these gurus to expound a bit, this attitude generally devolves back to a couple of bad experiences where datasets were used poorly or shoved into a situation where they didn't belong. Indeed, Karl goes on to give the kind of thing he doesn't want to see and I have to agree that he has a point. But while his example uses a dataset, it isn't the source of the problem. The problem is actually in his second point after datasets:

Data access shouldn't be in the aspx or codebehind

Since he's looking for strong enterprise-level coding habits, he's right that it'd be better encapsulated in its own class, and better still in its own library.

Again, it isn't the dataset he actually has a quibble with. He's just perpetuating a prejudice when he reflexively includes them as a first strike. To his credit, he's willing to own up to the prejudice. Unfortunately, he does so in a way that indicates that it is a prejudice he has no plans to explore or evaluate. That's what I hate about the whole anti-dataset vibe in the guru set. Particularly since these tend to be people who are proud of their rationality and expect others to listen to them when they expound on technical topics.

 

23. November 2006 16:47 by Jacob | Comments (0) | Permalink

DataSets and Business Logic

Whoa, that was fast. Udi Dahan responded to my post on DataSets and DbConcurrencyException. Cool. Also cool: he has a good point. Two good points, really.

Doing OLTP Better Out of the Box

I'll take his last point first because it's pure conjecture. Why don't DataSets handle OLTP-type functions better? My first two suggestions would, indeed, be better if they were included in the original code generated by the ADO.NET dataset designer. I wish that they were. Frankly, the statements already generated by the "optimistic" updates option are quite complex as-is, and adding an additional "OR" condition per field wouldn't really add that much in either complexity or readability (both are beyond repair anyway), while it would add to reliability and reduce error conditions.

My guess is that it has to do with my favorite gripe about datasets in general: nobody knows quite what they are for. I suspect that this applies as much to the folks in Redmond as anywhere else. Datasets are obviously a stab at an abstraction layer over the server data, one that makes disconnected database work easier for the regular (i.e. non-database, non-enterprise-guru) developer. But that doesn't really answer the question of what they are useful for and when you should use them.

DataSets are, essentially, the red-headed stepchild of the .NET framework. They get enough care and feeding to survive, but hardly the loving care they'd need to thrive. And really, I think that LINQ pretty much guarantees their eventual demise. Particularly with some of the coolness that is DLINQ.

Datasets Alone Make Lousy Business Objects

As much as I am a fan of DataSets in general, you have to admit that they aren't a great answer in the whole business layer architecture domain.

I mean, you can (if you are sufficiently clever) implement some rudimentary data validation by setting facets on your table fields (not that most people do this--or even know you can). You can encode things like min/max, field length, and other relatively straightforward data purity limitations. Anything beyond this, however (like, say, requiring that orders in Japan have an accompanying telephone number to be valid), would involve either some nasty derived class structures (if you even can--are strongly-typed DataTables inheritable? I've never tried. It'd be a mess to do so, I think), or wrapping the poor things in real classes.
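To show what even a minimal wrapper ends up looking like, here's a sketch of that Japan rule as a little routine you'd run over the table before sending changes to the database (the Orders table and its columns are hypothetical):

' Flags changed rows that violate the rule before the adapter's Update is
' called. Table and column names are made up for the sketch.
Imports System.Data

Module OrderRules
    Public Sub CheckPhoneNumbers(ByVal orders As DataTable)
        For Each row As DataRow In orders.Rows
            If row.RowState = DataRowState.Added OrElse row.RowState = DataRowState.Modified Then
                If Not row.IsNull("Country") AndAlso CStr(row("Country")) = "JP" AndAlso _
                   (row.IsNull("Phone") OrElse CStr(row("Phone")).Trim() = "") Then
                    row.SetColumnError("Phone", "Orders in Japan require a telephone number.")
                End If
            End If
        Next
    End Sub
End Module

It works, but notice that the rule lives entirely outside the DataSet--which is exactly the point.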

One solution to this is to use web services as your business layer and toss DataSets back and forth as the "state" of a broader, mostly-conceptual object. This is something of a natural fit because DataSet objects serialize easily as XML (and do so much better--i.e. less buggy--in .NET 2.0). It de-couples methods from data, so it isn't terribly OO. But it can work where complex rules have to hold across widely disparate clients (like a call center application and a self-serve web sales application) and development speed is a concern (as in, say, a high-growth environment).
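A bare-bones sketch of the pattern (an ASMX-style service with made-up names), just to show how little ceremony is involved in tossing a DataSet across the wire:

' Hypothetical web service: the service owns the business rules, the DataSet
' just carries state and serializes to XML on the wire.
Imports System.Data
Imports System.Web.Services

Public Class OrderService
    Inherits WebService

    <WebMethod()> _
    Public Function GetOrders(ByVal customerId As Integer) As DataSet
        Dim orders As New DataSet("Orders")
        ' ... fill from the data access layer ...
        Return orders
    End Function

    <WebMethod()> _
    Public Sub SaveOrders(ByVal changes As DataSet)
        ' Validate the business rules here, then hand the changes to the adapters.
    End Sub
End Class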

I think this leads to the kind of complexity Udi says he has seen with datasets. The main fault line is that which methods to call (and where to find them) lives in design documents or in a developer's head. This can easily lead to a nasty duplication of methods and chaos--problems that functionally don't exist in a stronger object paradigm.

That Said...

Here is where I stick my neck out and reveal my personal preferences (and let all the "real" developers write me off as obviously deluded): although DataSets make admittedly lousy business objects, most non-enterprise level projects just don't need the overhead that a true object data layer represents. For me, it's a case of serious YAGNI.

Take any number of .NET open source software projects I've played with: not one uses DataSets, yet not one needs all the complexity of its custom-created classes, either. They aren't doing complex data validation, and their CRUD operations are less robust than those produced automatically by the dataset designer--all at a higher cost in resources to produce.

Or take my current place of gainful employ. We have five ASP.NET applications that all have an extremely complex n-tier architecture--all implemented separately in each web application (and nowhere else--they're not even in a separate library). Each of the business objects implements a bunch of properties that are straight get/set wrappers around an internal field. And that is all they are. Oh, there are a couple of "get" routines that populate the object for different contexts using a separate Data Access Layer object. And an update routine that does the same. And a create... you get the point. It's three layers of abstraction that don't do anything. I shudder to think how much longer all that complexity took to create when a strongly-typed DataSet would have done a much better job and taken a fraction of the time. It makes me want to call the development police to report ORM abuse.

Which is to Say

Don't let all that detract from Udi's point, though. He's right that for seriously complex enterprise-level operations, you can't get around the need for good architecture, and datasets alone will likely be inadequate there. Relying wholly on DataSets in that case will get you into trouble.

I personally think that you could get away with datasets being the communication objects between web services in most cases even so, but I also realize that there are serious weaknesses in this approach. It works best if the application is confined to a single enterprise domain (like order processing or warehouse inventory management). Once you cross domains with your objects, you incur some serious side-effects, not least of which is that the meaning of your objects (and the operations you want to perform on them) can change with context (sometimes without you knowing it--want an exercise in what I mean? Ask your head of marketing and your head of finance what the definition of a "sale" is--then go ask your board of directors).

So yeah, DataSets aren't always the answer. I'd just prefer if more developers would make that judgement from a standpoint of knowing what DataSets are and what they can do. Too often, their detractors are operating more from faith than from knowledge.*

*Not that this is the case for Udi. For all he has admitted that he isn't personally terribly familiar with datasets, his examples are pretty good at delineating their pressure points and that tends to indicate that he's speaking from some experience with their use in the wild.

 

21. November 2006 19:23 by Jacob | Comments (0) | Permalink

4 Solutions to DbConcurrencyException in DataSets

Following links the other day, I ran across this analysis of DataSets vs. OLTP from Udi Dahan. His clincher in favor of coding OLTP over using datasets is this:

The example that clinched OLTP was this. Two users perform a change to the same entity at the same time – one updates the customer’s marital status, the other changes their address. At the business level, there is no concurrency problem here. Both changes should go through. When using datasets, and those changes are bundled up with a bunch of other changes, and the whole snapshot is sent together from each user, you get a DbConcurrencyException. Like I said, I’m sure there’s a solution to it, I just haven’t heard it yet.

I thought about this for a minute and came up with four solutions for DbConcurrencyException in this scenario using DataSets (though the first two are essentially the same and differ only by who actually implements it). I'm sure there are others, but this should do for starters.

  1. Use stored procedures created by a competent DBA that utilize parameters for the original and new column state. This means that you check each field with an "OR (<ds.originalValue> = <ds.updateValue>)". This solution passes the same two parameters per field as an "optimistic" pre-generated update statement, but it makes the update statement larger by adding this new "OR" condition for each field.
  2. You can do the same by altering a raw update generated from the DataSet designer. This means sending a longer statement to the database with each update, though this can be offset by setting your batch size higher if you have lots of updates you're sending (uh, you'd need ADO.NET 2.0 for that). I'd hesitate to use this method, but that's more a matter of personal taste than anything else (because I'd prefer using stored procedures, and I recognize that internal network traffic generally isn't the bottleneck in these kinds of transactions, though on-the-fly statement execution plan creation could be).
  3. Hook the adapter's RowUpdating event to alter the command sent based on which fields have actually changed. This is probably the closest in effect to the OLTP solution envisioned by Udi. This solution is problematic for me simply because I've never actually tried to do it and I'm not sure you can hook into the base adapter updates on each execution. If you can't, an alternative (in ADO.NET 2.0) would be to create a base class for the table adapters and create an alternative Update function in derived partial classes. In this case, you'd have "AcceptFineGrainedChanges" or some such function that you'd call. Once the alternative base class was created, custom programming per table adapter would be a matter of a couple of moments. I've done something similar using the designer with Sybase table adapters and it worked out pretty well. I'd have to actually try this to make sure it'd work, though. Call this two half-solutions if you're feeling stern about it.
  4. This last would be useful if I have a relatively well-defined use case that isn't going to morph much or require stringent concurrency resolution. In this one, you deliberately break the one-for-one relationship between your dataset and database (i.e. one database table can be represented by multiple dataset tables). In Udi's concurrency example, the dataset would have a CustomerAddress table and a CustomerStatus table. Creating the dataset with custom selects would generate the tables pretty painlessly with appropriate paranoia (there's a rough sketch of this right after the list). Now, this only really pushes his concern down a little, making it less likely to be an issue. It doesn't eliminate it. It'd probably handle most of the concurrency problems people are likely to run into. Or at least, push them out beyond where most people will ever experience them (not quite the same thing). It could be taken to a ridiculous extreme where each field is its own datatable (which is just silly, but I've seen sillier things happen), so a little balance and logical separation would be needed.
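Here's a rough sketch of solution 4 using Udi's example: two narrow selects against a hypothetical Customer table, each with its own adapter, so an address update and a status update can never collide:

' Split one database table into two narrower DataTables. Each adapter's
' generated UPDATE only touches its own columns. Names are hypothetical.
Imports System.Data
Imports System.Data.SqlClient

Module SplitCustomerTables
    Public Sub Demo(ByVal conn As SqlConnection)
        Dim ds As New DataSet()

        Dim addressAdapter As New SqlDataAdapter( _
            "SELECT CustomerID, Address, City, PostalCode FROM Customer", conn)
        Dim statusAdapter As New SqlDataAdapter( _
            "SELECT CustomerID, MaritalStatus FROM Customer", conn)

        ' The command builders generate UPDATEs that only cover the selected
        ' columns (the select must include the primary key).
        Dim addressBuilder As New SqlCommandBuilder(addressAdapter)
        Dim statusBuilder As New SqlCommandBuilder(statusAdapter)

        addressAdapter.Fill(ds, "CustomerAddress")
        statusAdapter.Fill(ds, "CustomerStatus")

        ' ... edits to the two tables happen here, possibly by different users ...

        addressAdapter.Update(ds, "CustomerAddress")
        statusAdapter.Update(ds, "CustomerStatus")
    End Sub
End Module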

OLTP may seem more natural as a solution for many, but that's likely an issue of preference and sunk costs (because they've done it before and are comfortable with that solution space). It certainly isn't the only solution, though, nor is it a stumper for datasets.

Finally, I’ll add a caveat that I'm not saying that datasets are necessarily to be preferred over stronger object models. I just know that they get pretty short shrift from "real" developers in these kinds of discussions and want to make sure that the waters remain appropriately muddied. There may be a universal stumper for datasets I don't know about. There are certainly environments where a formal OLTP or ORM tool would be a legitimately preferred solution.

 

21. November 2006 05:35 by Jacob | Comments (0) | Permalink

Two Things I Regret

Have you ever been in an interview and gotten some variation on the question "What do you regret most about your last position?" Everyone hates questions like that. They're a huge risk with little upside for you. You're caught between the Scylla of honesty and the Charybdis of revealing unflattering things about yourself.

Still, such questions can be very valuable if used personally for analysis and improvement. In that light, I'll share with you two things I regret about my stay at XanGo. Since I've ripped on the environment there in the past, it's only fair if I elaborate on things that were under my control at the time--things I could have done better.

Neglecting the User

Tim was the Senior IT Manager (the position that became IT Director once XanGo had grown up a bit). He was the best boss I ever had. His tech skills were top-notch (if somewhat "old school"). In addition, he knew his executives and how to communicate with them on a level they understood. It was a refreshing experience to have someone good at both technology and management (and since he's no longer my boss, you can take that to the bank :)).

After I'd had a little break-in time as the new Software Development Manager, Tim and I discussed what we needed to do for the organization. Tim's advice was to establish a pattern of delivering one new "toy" for our Distributors each month. He said that the executive board was very attached to the Distributors and that keeping things fresh and delivering new functionality and tools to them each month would make sure that we had enough of the right kind of visibility. Goodwill in the bank, so to speak.

This sounded like a great idea, and frankly, "toy" was loosely defined enough that it shouldn't have been a hard thing to do. It turned out to be a lot harder than expected, however. In my defense, I'll point out that we were experiencing between 15% and 20% growth per month and that we had done so since the company had started a year and a half before. That growth continued my entire tenure there. Now, if you've never experienced that kind of growth, let me point out some of what that means.

First off, using the Rule of 72 (the coolest numeric rule I know) will tell you that we were doubling every 4 to 5 months (in every significant measure--sales, revenue, Distributors, traffic, shipping, everything).
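(The rule, if you haven't run into it: divide 72 by the percentage growth per period to get the number of periods it takes to double. At 15% a month that's 72 ÷ 15, or just under five months.)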

That kind of growth feels like ice-skating with a jet engine fired up on your back. Even good architecture will strain under that kind of relentless pressure, and when it does, you become hyper-vigilant for signs of trouble. That vigilance is grounded in enough reality to be worth maintaining. Unfortunately, it also makes it easy to forget your users.

Developers like to live in a pristine world of logic and procedure. Unfortunately, life, and users, aren't like that. If they were, there'd be less need for developers. Users don't see all the bullets you dodged. They take for granted the fact that a system originally designed for a small start-up is now pulling off enterprise-level stunts. They don't see it, so it doesn't exist. It is very easy to get caught up in the technology and forget that often it is the little touches that make your product meaningful. Sometimes the new report you spent an hour hacking together means more than the three weeks of sweating out communication with a new bank transaction processor. And by means more, I mean "is more valuable than".

Not that you can afford to neglect your architecture or the improvements needed to sustain the company and prepare for foreseeable events. If you ignore that little glitch in payments processing this month, you really have no excuse when it decides to spew chunks spectacularly next month.

What I'm saying here is that you have to balance functionality with perceived value. You have to know your users and their expectations because if you aren't meeting those expectations, no amount of technical expertise or developer-fu is going to help you when things get rough. In the case of XanGo, I could have afforded to ease up on the architecture enough to kick out monthly toys for the users. Yeah, some things would have been a touch rockier, but looking back there was room for a better balance.

Premature Deprecation

When I arrived at XanGo, our original product (a customized vertical market app written in VB6 on MS SQL Server) was serving way beyond its original specifications. We'd made some customizations, many of them penetrating deep into the core of the product. Our primary concern, however, was the Internet application used by our Distributors in managing their sales. We spent a month or two moving it from ASP to ASP.NET and ironing out bugs brought on by the number of concurrent users we had to maintain. We also removed the dependence on a couple of VB6 modules that were spitting out raw HTML (yeah, I know. All I can say is that I didn't design the monster).

Anyway, after that was well enough in hand, we gave a serious look to that VB6 vertical market app. Since VB6 wasn't all that hot at concurrent data access and couldn't handle some of the functionality we were delivering over the new web app, we decided that it should be phased out. Adding to this decision was the fact that we had lost control of the customizations to that app and what we had wouldn't compile in its present state.

Now developers (and for any management-fu I may have acquired, I remain a developer at heart) tend to be optimistic souls, so we figured "no big", we'll be replacing this app anyway. And we set to work. Bad choice. In a high growth environment, the inability to fix bugs now takes on a magnified importance. Replacing an application always takes longer than you expect if only because it's so easy to take the current functionality for granted. Any replacement has to be at least as good as the current application, and should preferably provide significant, visible improvements.

The result of this decision was that we limped along for quite some time before we finally came to the conclusion that we absolutely had to be able to fix the current app. That gap cost us a lot of political capital. In the end, getting the old app building again took a top developer out of circulation for a while, but once it was done, it was astonishing how much pressure was lifted from Development.

It's the Users, Stupid

No, I did not mean to say "It's the stupid users." When it comes right down to it, software exists to serve users, not the other way around. As developers, we find it easy to acquire a casual (or even vehement) dislike of our users. They are never satisfied, they do crazy stuff that makes no sense, and they're always asking for more. It's tempting to think that things would be so much better without them.*

I got into computers because I like making computers do cool stuff. Whatever got you into computers, though, it's a good idea to poke your head up periodically and see what your users are doing. Get to know who they are. Find out what they think about what you've provided for them. Losing that focus can cost you. Sometimes dearly.

*I think one of the draws of Open Source is that the developer is the user. It's also the primary drawback. But that's a post for another day.

 

17. November 2006 20:10 by Jacob | Comments (0) | Permalink

Blogging Software Update

Well, I still haven't chosen what blog software I want to use in a new home. So many to choose from, and none are a perfect fit. The comments left by Scott Hanselman, Phil Haack, and Dave Burke were good ones and point out the connectedness of the blogosphere (is that word accepted enough to forgo the quotes now?).

Since I gave DasBlog such short shrift in my original post, I spent some quality time with it. DasBlog has some really strong points in its favor. For one, it has a very active developer community, though that isn't as apparent as it is with Subtext. DasBlog 1.9 was released today and I'm even mentioned in the release announcement (it's always nice to see your name in print--even virtual print).

I am leaning towards DasBlog now. Two bonuses of using it: no tricky database settings (it uses XML files stored in the web file system), and a really flexible plug-in system (they call them macros, but they're actually compiled .NET assemblies with a standard interface and config-file usage).

Not that I've decided to use it. Or not. I'm not generally this indecisive. I think...

 

22. September 2006 18:38 by Jacob | Comments (0) | Permalink

More Validation

It's always nice to have your opinions confirmed by someone you respect. Joel Spolsky's latest series on recruiting developers has a final section that makes essentially the same point I made a couple of months ago: that specific technologies aren't as important in hiring developers as general savvy is. He even issues the same caveat that you do need domain experts for leavening. Nice. As is usual with Joel, he includes a thorough analysis of the issue he explores, so there's a lot of good rumination there.

 

8. September 2006 09:59 by Jacob | Comments (0) | Permalink

Professional Integrity

Lidor Wyssocky has some good thoughts on why it is that developers don't implement changes that they know would be helpful.

The problem is that although we know exactly what doesn’t work right and how it should be fixed, most of us will never say anything. We don’t say anything because there’s a very good chance the minute we do we will be marked as uncooperative, pessimistic, or simply detached from the business reality. (emphasis in original)

He concludes with his call to action.

If more of us say what we know in our hearts to be true, the rest won’t be able to ignore it anymore. Hopefully.

And I agree with him. What he is talking about is an aspect of professional integrity. If you are a professional that a business relies on for good decision making, then you need to act the part and provide reasonable suggestions wherever they may do good.

But Lidor's points come with both a flip-side and a problem, each inherent in professional integrity, and both need to be explored.

The Flip-side

While you avoid negative attention by remaining silent about needed process change, you can gain positive attention by supporting whatever has caught the eye of your executives. This is a major contributing factor to "The Silver Bullet Syndrome". At XanGo, for example, we had one company convince management that they could cut development costs to 10% of their current level. Not 10% off, mind you; 10% of current levels.

Despite all logic to the contrary (I mean, think about it--if a product could actually deliver cost savings of 90%, every company on the planet would be using it--they'd be incompetent not to), XanGo ended up spending millions of dollars and wasted a full year of development on this silver bullet. What was most fascinating to me were those who were willing to endorse this silver bullet. Those people were promoted to team leaders and project managers. Any who would oppose the new development project were either removed before its introduction or told (in at least one case explicitly) that negative feedback would put their jobs in jeopardy.

In this way, technical people who should have known better ended up perpetuating the problem, not by staying silent but by actively promoting bad solutions. They were rewarded for doing so. Here's the thing: in most cases, there really is no price to pay for backing a losing silver-bullet solution. Oh, those in front of the board have negative exposure when it all blows up. Some of the higher management tiers are at risk as well. But for most of IT, there simply is no bill to pay for being in the pack.

The Problem

Here's the thing: having Professional Integrity carries not-insignificant risks.

  1. Having integrity makes you predictable. Those with an agenda can plan their actions confident of what you will do. That means that if things get nasty, you end up following Marquess of Queensberry rules in a bar fight.
  2. Your compensation is in the hands of ignorant people. They don't know what you know. Many of them may be smarter than you. Even if they aren't, though, they are still the ones signing the paychecks.
  3. There is seldom a single, obvious best answer. Honest and rational people can arrive at differing conclusions based on the same information (because they weight aspects of the problem domain differently). This can easily split the message received by management and open the door for bad solutions. This also leaves you vulnerable to the manipulative because they can confuse the issue at need.
  4. Costs of bad solutions are often deferred, while the cost of changing minds is immediate. Whether you are opposing a bad proposal or proposing a better process, you are asking for change, and that makes people uncomfortable. If you make people uncomfortable, they will often hold you responsible for their discomfort.
  5. Sometimes having professional integrity means looking for a new job. Whether you are fired or are forced to resign, some positions are simply incompatible with maintaining professional integrity. These situations are rare, but they can happen.
  6. People aren't rational. As a result businesses often make choices that seem irrational. Sometimes those choices are right even though they are irrational. Professional Integrity in such a situation is problematic at best.

As I see it, Lidor's call to arms makes sense but is unlikely without more support. The thing is that the problem is bigger than he has acknowledged. He is asking us to take actions that are overtly risky and he hasn't given us enough of a reason to do so. I mean, the emperor may not have any clothes, but there's no telling how he'll react to you pointing that out.

The Solution Domain

"Because it's the right thing to do" resonates with most of us, I think. But it is important to acknowledge that, at heart, Lidor's plea is a moral and not a technical one. As such, it needs to be approached from a social, not a technical standpoint. The thing is, we are equipped to make this happen, but we need to explore it in the right context if we hope to make progress.

Societies have a lot of experience coercing correct behavior. There are a lot of tools available, some better than others. My preferences tend to flow from my LDS background. As such, I'd advocate proselytizing correct procedures. We should explain as clearly as we can the benefits of acting correctly. While the payoffs of acting correctly are long-term, they exist nevertheless and are provable and measurable. Professional Integrity in IT has similar long-term value (above and beyond correct practices) to businesses and those who learn to identify and value it will gain the benefits it brings. A company that hires professionals with integrity will do better than companies that hire similarly talented people who lack that dedication.

As long as we're borrowing from religious tradition, let's hit up the Catholics and/or Jews (leaving no stereotype unturned) and apply some well-earned shame. I know who sold us out at XanGo. My reaction to them in the future can (and should) be informed by their actions there. I think we do ourselves a disservice if we know compromised IT professionals and don't tell them our opinion of their actions or make those opinions public. Repentance (more religious terminology, but applicable) should be possible, certainly--people can change, and reformation should be encouraged. That said, it's important that forgiveness be contingent on honest efforts toward reparation and change.

On a more general level, ostracizing those who behave badly is one of society's greatest tools to restrain bad behavior. The next time I review resumes to evaluate new hire candidates, I'll have an eye out for those I know who, while they may be bright and talented developers, ended up endorsing those things that were bad for the business. More generally, I'll try to craft questions that explore a candidate's professional integrity in addition to their technical expertise. Reputations in IT should be more than merely technical. Professional Integrity is important not just to businesses but to the perception of IT as a whole. The IT crash in 2001 contained a lot of payback for abuses in IT during the build-up of the bubble. While we were on top many of us made unreasonable and expensive demands on companies (some personal, some technological) and those abuses weakened us as a group.

Collective bargaining is one route that I'm sure will come up--a union or standards body that enforces "correctness". Personally, I'm not a fan of that option. Bureaucracies are inefficient and tend to be in opposition to business interests. We do not need more antagonism from the executive suite. Unions also have a tendency to groupthink that I consider inimical to technology innovation.

I'm sure there are others that I am overlooking. I'd love to hear further thoughts in the comments. Please leave 'em if ya got 'em. Don't make me beg.

I acknowledge that my thoughts here will be hard to implement. They require conscious short-term effort with no promise of reward, and they're certainly beyond the power of any single individual to effect alone. Still, I believe it's possible to make things better than they are right now and that the potential for reward is both real and achievable. Plus, Paladins are made for exactly this kind of idealistic fight for what's right...

 

28. August 2006 21:23 by Jacob | Comments (0) | Permalink

An Object Oriented Beating

I wish that I could take credit for this, but I can't. The fact of the matter is that one of my developers created it and sent it to me one day when she was particularly frustrated. It isn't her fault that it's in VB; that was our environment at the time. I take it as a badge of honor that it was sent to me personally rather than to the rest of the team (don't get me wrong, the rest of the team were copied, but I was the target, and it was obvious that I was trusted to get the joke--and now it's probably all ruined by me overanalyzing it, and anyway, I pass it on for your viewing pleasure)...

Jacob's Object Oriented Beating!

Private Enum Level
    Wuss
    Ouch
    #!$%()
    OMG
    Blackout
End Enum

Private Function CyberBeating(ByVal Frequency as Int) as Pain
Dim Intensity as Double
Dim Maximum as Double
Dim Jacob as Object

Maximum = Level.Blackout
Intensity = Level.Wuss

    Do While Intensity < Maximum
        Jacob.Beat(Frequency, Intensity)
        Intensity += 1
    Loop
    Return Intensity

End Function

CyberBeating.Execute

 

 

26. July 2006 03:24 by Jacob | Comments (0) | Permalink

Experienced Developers

The following applies mainly to in-house business software development. It might or might not apply to ISV or other product development houses. I think that there's room for broad application, but you can hit my list of software blogs if you want some quality sources for more generalized ISV or product development exploration.

Back when I was looking to hire developers a couple of years ago, I knew some programmers that I wanted on my team. I had worked with them before and knew what they were capable of. Unfortunately, they didn't have much experience with .NET--our platform at the time. I made the case then (to my boss and HR) that a good developer is a good developer and that the language and framework are more or less immaterial for people with a broad set of experience. The syntax and capabilities of the language and environment are teachable, and a developer's broad experience helps them pick it up quickly.

I was fortunate enough to bring one of those developers (who is currently working in indie games) onto my team. The experience was great. I was right that he was able to come up to speed and pick up the nuances of the environment very quickly. Further, he was able to solve some issues outside of the framework in ways that saved us a lot of time (using Python for text file manipulation, for example).

So does this mean that you can toss out environment-specific skill sets when evaluating whom to hire? Not in my opinion. You see, at that time we already had a core group that was intimately familiar with the capabilities of .NET. Domain-specific knowledge is crucial in producing software applications that actually do what they are supposed to do in a reasonable timeframe. People who have already digested the learning curve are invaluable in blazing the way for others who haven't yet done so. The reason specialists are crucial is simple--somebody who has already walked the trail is able to point out the pitfalls and shortcuts and help the whole team move quickly. The new guy was able to come up to speed so fast because he had people he could ask when he had questions and who could explain why things worked the way they did.

I came out of this experience with respect for having a diverse programming team. You have to have at least some specialists who know the environment inside and out. These specialists are going to be the limit of what you can accomplish in a lot of ways, so they need certain skills--chief among them, the ability to communicate with other developers (because they'll do more of that than they probably want to, and because a specialist who can't communicate isn't able to influence the project nearly as well as one who can).

But I personally don't think it is a good idea to have a team or department composed solely of specialists. Any software development that requires more than two people is going to run into problems in a wide array of domains, and having a broad skill set to draw on can be very useful. What proportion of specialists to generally kick-ass developers do you need? Well, that's a good question. I welcome ideas in the comments, but I suspect this is one of those things that is highly contextual and thus at the heart of the "art" of software development. I'm thinking that a narrowly defined ISV benefits from a higher ratio of specialists than, say, a multi-level marketing company looking to manage its necessarily custom business needs.

25. July 2006 19:30 by Jacob | Comments (0) | Permalink
