06 December 2006

Whither AI?

I took a few classes on AI back in the '80s, did some Prolog, Lisp, expert system, neural net and genetic algorithm work in the late '80s, and I've been hooked ever since. (I also recently did work on mobile, intelligent, multi-agent systems for which my employer submitted patent applications. But agents are another blog topic!) Something about software that seems to reason, seems to "analyze and think", derives solutions, adapts, works with incomplete information, refines, etc., is intriguing. I had a professor who said the best form of enterprise computing was batch processing (at that time, interactive systems such as TSO, CMS and CICS were significantly more expensive than batch). When asked why, he said, "Because there is little or no human interaction involved, and people, especially skilled people, are expensive." Batch systems processed data unattended, eliminating many human jobs (long before the term IT was commonplace, the field was known as DP - data processing). Automation was a key objective in data processing, and I spent many of my early years automating and maintaining automation. You don't hear the terms "automation" or "data processing" much anymore, let alone as key objectives.

Unfortunately for automation proponents in the DP days, not everything lent itself to automation with the technology of the day. Knowledge workers - workers who had knowledge about the business - were skilled resources who were expensive to develop and to retain. If only there were a way to automate the thought processes of these knowledge workers! That wish was one of the key business drivers behind the AI wave of the '80s.

Essentially, the processing power was not available at a price point that would let AI deliver on its promise, so the AI bubble burst. Still, I take issue with the position that "AI failed to deliver on its promise": while it may not have lived up to all the marketing hype (what technology has?), it did deliver much of the promised value. I know, because we delivered working expert system, GA and neural net solutions. With the "enterprise" thinking of the day ("bigger is better"), though, solutions did not come cheap, nor were they easy to implement - due more to corporate politics than to the AI technology of the day.

My first commercial expert system was developed using Borland's TurboProlog (released during the reign of Philippe Kahn; the product now lives on at PDC) which, I think, was $99 a copy at the time! Now, if you had Prolog developers, or hired them, how is this not cost effective?! Noooooo, we had to buy a multi-million dollar expert system monstrosity that probably cost seven figures to acquire and more than that to run annually (it had mainframe components), not to mention the runtime fees (anyone else remember runtime fees?!). It was also harder to use than writing solutions in TurboProlog. Gotta love those "pointy-headed" bosses and decision makers. That's when you realize some organizations just have too much money to burn. They could have spent a fraction of that using us "knowledge engineers" and TurboProlog to build expert systems. All of us had mainframe experience; we could have made it work with the mainframe and the VSAM files, IMS and DB2 systems - and for much less!
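
To give a flavor of what those expert systems did - this is C rather than TurboProlog, and the facts and rules are invented purely for illustration - here is a minimal forward-chaining sketch: facts go into working memory, and rules fire until a pass derives nothing new. A real shell adds backward chaining, certainty factors, explanation facilities and more, but the core inference loop really is this small.

```c
#include <stdio.h>
#include <string.h>

/* A toy forward-chaining inference loop: facts are asserted, rules fire
   until nothing new can be derived. The facts and rules here are made up
   purely for illustration. */

#define MAX_FACTS 32

static const char *facts[MAX_FACTS];
static int n_facts = 0;

static int known(const char *f)
{
    int i;
    for (i = 0; i < n_facts; i++)
        if (strcmp(facts[i], f) == 0) return 1;
    return 0;
}

static int assert_fact(const char *f)
{
    if (known(f) || n_facts == MAX_FACTS) return 0;  /* nothing new */
    facts[n_facts++] = f;
    printf("fact: %s\n", f);
    return 1;                                        /* working memory changed */
}

/* IF antecedent(s) THEN consequent; if2 may be NULL for one-antecedent rules */
struct rule { const char *if1, *if2, *then; };

static const struct rule rules[] = {
    { "claim-over-limit", "new-customer", "refer-to-underwriter" },
    { "claim-over-limit", NULL,           "require-documentation" },
    { "refer-to-underwriter", NULL,       "hold-payment" },
};

int main(void)
{
    int changed, i;
    const int n_rules = sizeof rules / sizeof rules[0];

    /* seed the working memory, as a user session might */
    assert_fact("claim-over-limit");
    assert_fact("new-customer");

    /* fire rules repeatedly until a pass derives nothing new (a fixpoint) */
    do {
        changed = 0;
        for (i = 0; i < n_rules; i++)
            if (known(rules[i].if1) &&
                (rules[i].if2 == NULL || known(rules[i].if2)))
                changed |= assert_fact(rules[i].then);
    } while (changed);

    return 0;
}
```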

My first neural network application consisted of a bunch of ANSI standard C shareware code and a now-forgotten ANSI standard C compiler, which I used to create a POC. When we sent RFPs to the neural net vendors of the day, the price tag came back at $600k+ for development, plus runtime fees, plus transaction fees. I had a low-cost alternative using NeuralWare (which, thankfully, is still going strong; great products, IMHO!) covering two developer seats, code generation, training for two, etc., which would have enabled us to do our own neural net solutions for a start-up price of about $20k. When I presented this, my "pointy-headed boss" at the time got angry at me, and at the fact that my total estimate to build the solution in-house was about $80k. That I was able to demonstrate success with shareware was not welcome either (the POC's success was what led to the vendor RFPs in the first place). It was possible to deliver cost-effective AI solutions in 1989-1990, if you didn't have a pointy-headed boss! Why they paid us "Knowledge Engineers" to do knowledge engineering, yet "green-lit" very little of what we did for production, I may never know; at least I had fun and learned much.
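
For the curious, here is roughly the shape of that kind of shareware-era POC - a sketch, not the original code: a toy backpropagation network learning XOR in C. The network size, learning rate and epoch count are illustrative; compile it with something like "cc xor.c -lm".

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* A toy backpropagation network (2 inputs, 3 hidden units, 1 output)
   learning XOR -- a sketch of the kind of ANSI C proof-of-concept
   described above, not the original code. */

#define NH 3                       /* hidden units */

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }
static double frand(void)       { return (double)rand() / RAND_MAX - 0.5; }

int main(void)
{
    double in[4][2] = { {0,0}, {0,1}, {1,0}, {1,1} };
    double out[4]   = { 0, 1, 1, 0 };
    double w_ih[NH][2], b_h[NH];   /* input -> hidden weights, biases */
    double w_ho[NH], b_o;          /* hidden -> output weights, bias  */
    double lr = 0.5;               /* learning rate                   */
    int i, j, p, epoch;

    srand(1);
    for (i = 0; i < NH; i++) {
        b_h[i] = frand();
        w_ho[i] = frand();
        for (j = 0; j < 2; j++) w_ih[i][j] = frand();
    }
    b_o = frand();

    for (epoch = 0; epoch < 20000; epoch++) {
        for (p = 0; p < 4; p++) {
            double h[NH], y, d_o, d_h[NH];

            /* forward pass */
            for (i = 0; i < NH; i++)
                h[i] = sigmoid(b_h[i] + w_ih[i][0]*in[p][0]
                                      + w_ih[i][1]*in[p][1]);
            y = b_o;
            for (i = 0; i < NH; i++) y += w_ho[i] * h[i];
            y = sigmoid(y);

            /* backward pass: output delta, then hidden deltas */
            d_o = (out[p] - y) * y * (1.0 - y);
            for (i = 0; i < NH; i++)
                d_h[i] = d_o * w_ho[i] * h[i] * (1.0 - h[i]);

            /* gradient-descent weight updates */
            for (i = 0; i < NH; i++) {
                w_ho[i] += lr * d_o * h[i];
                b_h[i]  += lr * d_h[i];
                for (j = 0; j < 2; j++)
                    w_ih[i][j] += lr * d_h[i] * in[p][j];
            }
            b_o += lr * d_o;
        }
    }

    /* show what the trained net thinks */
    for (p = 0; p < 4; p++) {
        double h[NH], y = b_o;
        for (i = 0; i < NH; i++) {
            h[i] = sigmoid(b_h[i] + w_ih[i][0]*in[p][0]
                                  + w_ih[i][1]*in[p][1]);
            y += w_ho[i] * h[i];
        }
        printf("%.0f XOR %.0f -> %.3f\n", in[p][0], in[p][1], sigmoid(y));
    }
    return 0;
}
```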

Fast forward to today: AI has made a strong comeback and is being used in many solutions and many domains. You can find freeware, shareware and open source for just about any AI sub-genre. So if it is so available, where is it in the enterprise? Unfortunately, mainstream enterprise development has yet to make AI a visible part of its solutions. Aside from a bit of autonomic computing, operations management, BI, analytics and rules engines, very little AI is mainstream (outside of gaming, I mean!). Yet we now have the open standards, the networks and the low-cost compute power to apply AI cost-effectively. Why we don't build intelligent software as a matter of course in enterprise development is another mystery. I think this is due, in part, to lack of awareness: whether your CS program even offered an AI course or concentration in the last 10 years is an open question (I don't follow current university curricula, so I could be mistaken).

Over the years, I have had the opportunity to mentor younger/junior programmers, designers and architects. The few I considered to have the "right stuff" (and worth the time) would often ask for advice, and the conversation would often turn to AI. I encourage those who joined the IT ranks after ~1995, or who already knew Java before graduating, to look at all the old (and new!) books, papers, conference proceedings, languages, tools, etc. on AI theory and practice. Many of these sources are freely available on the Web. Be aware, learn AI, and apply it to enterprise solution development. Mainstream intelligent software is long overdue.

Let me know if you are doing or applying AI in an enterprise or commercial setting...I'd love to hear about it, and hopefully others would too! Whither AI? Everywhere.

05 December 2006

REBOL, REBOL, Waiting for REBOL 3.0!

If you haven't heard of, or tried, REBOL, you should have a look. What it does, and how easily it does it, is just amazing. The size-to-power ratio is incredible. This is the way much business computing could, and should, be done. If you are an SMB (small to mid-size business), you should consider REBOL (large enterprises too, but there the bureaucracy and politics would probably be problematic, because REBOL can actually help your organization!).

I've been dabbling with REBOL for the last few years, and based on my experience with the current and prior versions, the next version, 3.0, promises to be even more amazing. So, Carl, I'm waiting patiently for version 3.0; please update the timeframe estimates!

I plan more blogs on REBOL in the future. It is a key tool in my personal toolbox. REBOL is different, innovative, simple, elegant and powerful: all things I like in a programming language, and it is more than a programming language.

Disclaimer: I don't work for REBOL Technologies, nor have I been compensated to endorse REBOL or to develop REBOL solutions. (I am interested in REBOL solution development opportunities, though. If you are an SMB in need of enterprise software solutions, let me know.)

Web 2.0: The new RAD

Web 2.0 is a great concept. If you take what I consider to be the seminal and definitive definition of Web 2.0, a key aspect is the coupling of data and processing logic...an extension of the object-oriented concept of encapsulated state and behavior.
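
To make the encapsulation point concrete, here is a deliberately tiny illustration in C (illustrative only; it has nothing to do with any particular Web 2.0 product): callers get an opaque handle plus the functions that operate on it, so the state and the behavior travel together.

```c
#include <stdio.h>
#include <stdlib.h>

/* Encapsulated state and behavior, C-style: callers see only an opaque
   handle plus the functions that operate on it, so the data and its
   processing logic travel together. (Illustrative only -- not web code.) */

typedef struct counter Counter;     /* opaque type, as a header would expose it */

struct counter { int value; };      /* definition would live in the .c file */

Counter *counter_new(void)
{
    Counter *c = malloc(sizeof *c);
    if (c) c->value = 0;
    return c;
}

void counter_increment(Counter *c)   { c->value++; }
int  counter_value(const Counter *c) { return c->value; }
void counter_free(Counter *c)        { free(c); }

int main(void)
{
    Counter *c = counter_new();
    if (!c) return 1;
    counter_increment(c);
    counter_increment(c);
    printf("count = %d\n", counter_value(c));   /* prints: count = 2 */
    counter_free(c);
    return 0;
}
```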

Other interesting aspects of Web 2.0 parallel the RAD trend of the 1990s client/server era. Many of the Web 2.0 vendors are enabling a web version of RAD - RAD for Web Applications (RAD4WA), or Rapid Web Application Development.

Even though they dropped the Mac version of their tool :( , when it comes to Web 2.0 application development I like ActiveGrid. One of the first things you'll notice about AG is that nearly every artifact is represented by some open standard; the rest are open scripting-language artifacts or are described by an XSD. Developing a web application that integrates web services and databases is simple, straightforward and quick.

Another brilliant move by AG is their server model. I can only imagine the discussion that led to their server design: we don't want a cluster model, we want a grid of low-cost, commodity processing nodes (LAMP-stack servers) that will provide enterprise scalability, fault tolerance, etc. The big vendors are spending huge amounts of money to develop grid / utility computing; what will we do? What does nearly everyone have, or at least have access to? The Apache web server. We'll write a grid module for Apache. Absolutely brilliant: simple and elegant.
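
To show what "write a module for Apache" means in practice, here is the standard Apache 2.x module skeleton in C. To be clear, this is not ActiveGrid's code; the "grid" handler name is hypothetical, and the skeleton merely shows the extension point such a module plugs into. You build and install it with apxs, Apache's module build tool.

```c
/* mod_grid.c -- the standard Apache 2.x module skeleton, shown only to
   illustrate the extension point a grid module would plug into. This is
   NOT ActiveGrid's code; the "grid" handler name is hypothetical.
   Build/install with Apache's module tool:  apxs -i -a -c mod_grid.c   */

#include <string.h>
#include "httpd.h"
#include "http_config.h"
#include "http_protocol.h"
#include "ap_config.h"

/* Content handler: Apache calls each registered handler per request; we
   decline unless the request was mapped to our handler name. A real grid
   module would decide here which node in the grid services the request. */
static int grid_handler(request_rec *r)
{
    if (!r->handler || strcmp(r->handler, "grid") != 0)
        return DECLINED;

    ap_set_content_type(r, "text/plain");
    ap_rputs("request dispatched by the grid module\n", r);
    return OK;
}

static void grid_register_hooks(apr_pool_t *p)
{
    ap_hook_handler(grid_handler, NULL, NULL, APR_HOOK_MIDDLE);
}

module AP_MODULE_DECLARE_DATA grid_module = {
    STANDARD20_MODULE_STUFF,
    NULL,                  /* per-directory config creator */
    NULL,                  /* per-directory config merger  */
    NULL,                  /* per-server config creator    */
    NULL,                  /* per-server config merger     */
    NULL,                  /* command (directive) table    */
    grid_register_hooks    /* hook registration            */
};
```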

Finally, AG lets you use any of a myriad of scripting languages, or plug in your own. If you are interested in Web 2.0, or in developing Web 2.0 applications, AG is worth a look.

Back to RAD...
RAD worked well. We were able to offload processing from expensive mainframes onto low-cost PCs and Unix boxes (where the databases lived). For the most part, the RAD solutions were primarily data presentation and data entry with input validation (sounds like what most commercial web sites provide, no?).

RAD tools and solutions had their issues and limitations too. They worked well for a certain class of applications where the business logic was not overly complex, but they lacked separation of concerns (e.g., MVC), which was invented in, and lived nearly exclusively in, the Smalltalk world, as did much of the design patterns work back then. As complexity rose and maintenance cycles accrued, RAD solutions became very brittle and difficult to maintain. Not to mention they were proprietary.

During part of the RAD era, I worked for a Smalltalk vendor named Digitalk, Inc. Unlike most of the RAD tools of the time, ours was based on Smalltalk. The RAD tool was called PARTS, an instance-based RAD programming tool: you built your Smalltalk classes as usual and used PARTS to create instances of any of the classes in your image and knit them together into an application. You could even combine instances into a component (called a Part) that you could save in Parts catalogues and reuse in other applications. Very simple and powerful. PARTS, though available standalone too, was part of Digitalk's Visual Smalltalk Enterprise - IMHO the best enterprise C/S RAD toolkit available at the time. I see striking similarities between what Digitalk VSE was to C/S RAD and what AG is to Web 2.0.

It remains to be seen whether the Web 2.0 application development vendors can provide more than rapid web application development and deployment tools. I think some can, are, and will. I've picked my winner; what say you?

Web 2.0 is RAD for the Web. I hope it goes better this time around!

Disclaimer: I used to work for Digitalk, Inc., and I have a fond nostalgia for their products. I do not work for ActiveGrid, nor do I have any business relationship with them. But if you are hiring for or planning an ActiveGrid project, please let me know. We should talk!