I spent a couple of hours today trying to pay a telephone bill via the telephone provider's portal, and these are the issues I faced:
- The telephone provider's portal has a prominent Login (with Signup) button, but signing up or logging in here does not give me access to all services. For example, I can view the value-added services but not bill payment, even though 'Bill Payment' is placed right next to the login button in the UI.
- I click the actual 'Bill Payment' link and am taken to a different site, with the standard warning 'You are moving out of the trusted site with secure credentials...etc', and I wonder what has happened, since I didn't log out.
- Next, I reach the real bill payment portal. From the URL I can see it is hosted on a different site, and I register/log in once again. The password rules are different here (this password has to be 8 characters long), while my earlier password for the apparently same portal had different length and alphanumeric constraints. Nothing new here, so I proceed.
- Once I successfully log in to my Bill Pay portal, the bill is promptly displayed (good integration with the database!) and then the next interface takes over. It looks like some kind of web service that displays all the approved banks along with Visa/Master/Diners card options, and I proudly click my NetBanking.
- And now the finale! I get an exception: "apache error - string out of length-10".
- I forgive the site, thinking perhaps the session didn't handle it well, and I go all the way back to my first page, log in twice with different credentials, and wait for the page to come up. With bated breath!
- Now, a different exception comes up - "Fatal error: Allowed memory size of 125829120 bytes exhausted".
- Now my son, aged 12, gives me an idea: do it really fast - keep clicking all the options as fast as you can and beat the time-out error. I laugh at that! What can a user do now, other than perhaps dance in front of the computer to keep it in good humour?
I'm not sure how much the Bill Payment test engineers tested, but here is what anyone should test - a list I christen 'common sense tests'.
1. Estimate the average and maximum number of users who would hit the site (the maximum being on the bill payment due date). If 50,000 subscribers appear in the log for a month, it is safe to assume 70% of the subscribers as the maximum. I know most companies can't afford a commercial testing tool that can simulate thousands of users, and using an 'evaluation version' of a load tester does NOT give the actual picture at all - the results of evaluation versions are not reliable, and one has to find other ways of simulating the number of users.
2. The number of users, the number of concurrent users, and the geographic distribution have to be tested. For example, Asia has roughly 300 million Internet users, the number progressively decreases for Europe and North America, and it becomes minuscule for Africa and the Middle East. It is not the percentage of Internet users that matters but the actual volume.
Account for latency tests from those geographic regions.
3. Ideally you would get marketing data about user behaviour - how long a user's eyeballs stay stuck to your landing page, or how many times the same user visits a typical category of web site - but how many of us can afford a Forrester or Gartner report, in money or time? Rely on the good old-fashioned logs instead. There are open-source OLAP tools where you can create the simple dimensions you want to measure and run the logs through them; you will get a fair indication of user behaviour. If your company has a Tools Team, ask them to do the analysis for you. Base your tests on the data you get for your portal.
4. More often than not, as with my Bill Pay application, each app is integrated (SOA-style) with another provider - providing a service, consuming one, or both - and there will be standard SLAs for capacity and volume on both sides, besides protocols, security measures, etc. Have tests that check whether the SLAs are defined as your business requires, backed by the right capacity planning.
5. Look for the pages, links, or queries that users access most. From a performance point of view, a 'Search' or 'My Orders' is more critical than 'Customer Case Studies'. Test whether all parts of the portal have uniform performance problems or not. An often quoted line in technical papers is: 'Run a mix of processing patterns and check the limits of infrastructure' :-). How's that for jargonizing!
6. Check for functional reliability - whether it is one user or a thousand of them, they should 'feel' the same accuracy, security, and ease of use, with no broken sessions. I saw a hilarious note on a blog (http://blogs.cio.com/node/228) that specified some rules that could affect functional reliability:
- All traffic is encrypted.
- All fields that display sensitive information are invisible unless you move the mouse pointer over them and click (hold the click to see the info).
- All screen savers are locked on a blank screen (no user-customizable fancy-dancy screen savers) and set at 1 minute maximum, with no user ability to change/reset this.
- All user systems have USB disabled, no CD-ROM drive and no floppy drive.
- All passwords must be a minimum of 8 characters long, with at least 2 numerics, 2 symbols, 2 capital letters and 2 lower-case letters. Zero repeat characters, and no character can be used in the same position more than once in 16 months.
- Passwords must be reset every 28 days - no exceptions.
7. Last but not least, examine whatever channels you have access to for 'unintended consequences'.
That is, how a user actually used the system in unintended ways, without necessarily trying to hack it. When I got the string length exception on my portal, I wanted to somehow pay my bill. I googled the error and read that it happens with long-named attachments, so I set about looking for clues in the HTTP path to avoid that. I couldn't work around it, but my point is that a tester should observe a hacker's or unintended user's behaviour and convert that into a test too.
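To put rough numbers on item 1 above: a back-of-the-envelope sizing can come from Little's Law (concurrent users = arrival rate x average session time). A minimal Python sketch, reusing the 50,000-subscriber figure from item 1; the 12-hour peak window and 5-minute session length are illustrative assumptions, not data:

```python
def peak_concurrency(subscribers, peak_fraction, window_hours, avg_session_secs):
    """Back-of-the-envelope concurrency estimate via Little's Law:
    concurrent users = arrival rate * average session duration."""
    peak_users = subscribers * peak_fraction           # users expected on the due date
    arrival_rate = peak_users / (window_hours * 3600)  # users arriving per second
    return arrival_rate * avg_session_secs             # average concurrent sessions

# 50,000 subscribers, 70% turning up on the bill payment date,
# spread over a 12-hour window, each spending ~5 minutes on the portal
print(round(peak_concurrency(50_000, 0.7, 12, 300)))  # → 243
```

Even this crude figure tells you whether a hand-rolled load script with a few hundred threads is enough, or whether you genuinely need a distributed load generator.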
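For the log mining suggested in item 3, even a few lines of script will surface the hit counts before you reach for a full OLAP tool. A sketch that assumes common-format access logs; the sample lines are made up for illustration:

```python
import re
from collections import Counter

# Pull the request path out of a common-format access log line
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP')

def top_paths(log_lines, n=3):
    """Count hits per URL path -- a poor man's OLAP dimension over raw logs."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits.most_common(n)

sample = [
    '10.0.0.1 - - [18/Jan/2007] "GET /billpay HTTP/1.1" 200 512',
    '10.0.0.2 - - [18/Jan/2007] "GET /billpay HTTP/1.1" 500 0',
    '10.0.0.3 - - [18/Jan/2007] "GET /vas HTTP/1.1" 200 128',
]
print(top_paths(sample))  # /billpay dominates
```

The same Counter-per-dimension idea extends to status codes, user agents, or hours of the day; each dimension tells you where to concentrate your tests.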
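The password rules quoted in item 6 are concrete enough to check mechanically (leaving aside the '16 months' positional rule, which needs history). A hypothetical validator, purely as a sketch of turning such rules into a test:

```python
def satisfies_rules(pw):
    """Check a password against the quoted rules: at least 8 characters,
    2 digits, 2 symbols, 2 capitals, 2 lower-case letters, no repeated chars."""
    checks = [
        len(pw) >= 8,
        sum(c.isdigit() for c in pw) >= 2,
        sum(not c.isalnum() for c in pw) >= 2,  # 'symbol' = any non-alphanumeric
        sum(c.isupper() for c in pw) >= 2,
        sum(c.islower() for c in pw) >= 2,
        len(set(pw)) == len(pw),                # zero repeat characters
    ]
    return all(checks)

print(satisfies_rules("Ab1!Cd2@"))  # True
print(satisfies_rules("password"))  # False
```

Feeding a table of passwords through a validator like this is exactly the kind of 'common sense test' the checklist argues for.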
Now, I sympathize/empathize with my Bill Payment portal. It was very kind to me! Tomorrow I will physically go and pay the bill!
Thursday, January 18, 2007
Tuesday, December 19, 2006
Product Success Metrics
This is a general checklist of what to measure in order to say that a product - a 'software product' - succeeds. There are various assumptions across the industry, but it finally boils down to these metrics:
Customers new and old
How many new customers are you acquiring per month? How many customers have cancelled maintenance or subscriptions per month? This is not a direct metric of the product itself, since it depends on other parameters such as sales efficiency, time to market, etc. However, measuring it is key to product success. Even if customers are still 'in' and belong to the client list, their degree of engagement should also be measured.
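The arithmetic behind this metric is simple enough to sketch; the customer numbers below are illustrative, not from any real product:

```python
def monthly_churn(start_customers, new, cancelled):
    """Churn rate = customers lost / customers at the start of the month;
    net growth shows whether acquisition is outrunning cancellations."""
    churn_rate = cancelled / start_customers
    net_growth = new - cancelled
    return churn_rate, net_growth

rate, growth = monthly_churn(start_customers=1000, new=80, cancelled=50)
print(f"churn {rate:.1%}, net {growth:+d}")  # churn 5.0%, net +30
```

Tracked month over month, the churn rate is the early-warning signal; the raw acquisition number alone can look healthy while the base quietly erodes.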
Quality!
Yes :-) this is an often-heard term, but product quality means so many things to so many people.
Product quality should be monitored on an ongoing basis to make sure it doesn't become a serious problem. Set up a tracking mechanism with weekly or monthly statistics covering not just the number of defects but the quality of the quality process itself. I have seen product heads track only the traceability from requirements to defects and nothing more. A well-tracked quality system produces exactly the metrics that show the quality health of the product.
For released products, technical support typically keeps track of open defects. As soon as a product is released, it is common for the number of defects to go up as more people use the product. If the quality is high, it should be a manageable number and should settle down after a while. For unreleased products, QA and release management should maintain strict vigilance on the numbers. If the number of open defects is not going down as the release date approaches, the release date is likely to slip. Obviously we want the Severity 1 and Severity 2 defects approaching zero as the release date approaches. Severity 4 "defects" are typically enhancement requests and are often ignored in terms of product quality, yet they indicate areas where the product fails to satisfy the customer. Product quality is the best indicator of the internal health of the product.
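The 'open high-severity defects should fall to zero as release approaches' rule can be encoded as a simple check. A sketch with made-up weekly (Severity 1, Severity 2) open counts:

```python
def release_ready(weekly_open_defects, threshold=0):
    """weekly_open_defects: list of (sev1, sev2) open counts, one per week.
    Ready when high-severity counts are monotonically falling and the
    latest week is at or below the threshold."""
    highs = [s1 + s2 for s1, s2 in weekly_open_defects]
    falling = all(a >= b for a, b in zip(highs, highs[1:]))
    return falling and highs[-1] <= threshold

print(release_ready([(9, 14), (5, 8), (2, 3), (0, 0)]))  # True
print(release_ready([(2, 3), (4, 6)]))                   # False: counts rising
```

A real tracking system would add severity weighting and incoming-defect rate, but even this trend check is more honest than a single point-in-time defect count.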
Technical Support
Product Management and Technical Support should work in tandem. Care should be taken that metrics are not tweaked for short-term benefit. I have seen Customer Support inflate the number of calls they attended by logging the same problem in different ways, and product management trying to answer multiple problems with a single statement of direction or FAQ that would portray them as having anticipated it as a known problem.
Technical (or Customer) Support already measures the number of calls per product and the nature of the calls. Product management should analyze these numbers to identify areas of that product that can be improved for a better customer experience. Are there problems with installation of the product? Is the documentation too hard to understand? Is the product too hard to use? By streamlining a process, will you cut down the number of technical calls? Review the number and type of calls for your product to uncover hidden profit leaks.
Product Schedule, scope and slippage
How long does it take from the time you define requirements until you have a finished product in the market? Keep track of the percentage of original requirement specs that get delivered as the final product. How many "out of scope" requirements were included? Do your developers have a clear idea of what problem to solve and for whom? The problem may be poor requirements definition, which you can control, or it may be scope creep which you need to communicate to your management. Report the facts and let management manage.
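The scope-delivered percentage and the out-of-scope creep suggested here can be computed from two requirement lists. A sketch with hypothetical requirement names:

```python
def scope_report(original_reqs, delivered_reqs):
    """Compare the original requirement set against what shipped:
    percentage of original scope delivered, plus out-of-scope items
    that crept in along the way."""
    original, delivered = set(original_reqs), set(delivered_reqs)
    delivered_pct = len(original & delivered) / len(original) * 100
    out_of_scope = sorted(delivered - original)
    return delivered_pct, out_of_scope

pct, extras = scope_report(["login", "billpay", "reports"],
                           ["login", "billpay", "themes"])
print(f"{pct:.0f}% of original scope delivered; crept in: {extras}")
```

Reported release after release, these two numbers separate a requirements-definition problem (low delivered percentage) from a scope-creep problem (a long out-of-scope list).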
Pre-Sales and Sales' support
How many non-sales people does it take to support the sales cycle? It would be a great metric to know how much time is being spent by non-sales people on direct sales efforts such as demo support, conference calls with prospective customers, onsite presentations, requirements assessment, and other sales calls. Early in a company's life people wear many hats, and this is common, but as a company grows, product selling needs to be 'empowered' with the right sales kits and training. Sales personnel need to continuously update themselves, as opposed to just having client meetings and playing golf.
When non-sales people participate heavily in the sales process, they are not doing their day job. And the cost is rarely attributed to the sales activity; another hidden profit leak.
Ultimately, we want to build and sell products profitably.
Product revenue/profit
How do you know which products are worth investing in and which should be retired?
Sales should be able to tell you how much revenue is generated per product. In order to calculate profit, each discrete product (not product line) needs a profit and loss statement tracked in accounting. Finance usually has these numbers, but it is Sales' responsibility to provide the profitability numbers to top management.
Market share
Know what market share you have. By tracking this metric over time (year over year), you can discover whether the market is saturated or ripe for expansion. If the market is growing and yet your market share is shrinking, a competitor is growing faster than you are. This is the hardest metric of all to get, and not all survey agencies that do this provide reliable data for Sales. Many standard business analysts, such as Forrester and Gartner, provide reports for the vertical industry itself, but again, each company has to find something that works best for it. And management must have the guts to accept the facts.
ROI on Marketing
There is a need for a closed-loop lead tracking system that allows you to track activity from the marketing program that generated the lead all the way through to a closed deal. Are the right market segments being targeted? Do the positioning and market messages get buy-in from the buyers? Do you have a compelling solution? Is it packaged as a compelling solution?
And finally....
It's not possible to track all metrics at once, but determine the key performance indicators you need to start tracking. Report the baseline, begin tracking on a periodic basis, graph the trends, and drill down to find out what is going on.
Start with the metrics for customers and product quality. These are the first ones to tackle.
Calculate the profitability of your product. If you have a loss leader, can you justify further investments? Do win/loss analysis to find out whether your product is driving business, even though it isn't profitable. If it should be retired, show some leadership and present the facts about why it should be retired and how you would redeploy the resources in a more profitable way. There's no place for personal attachments here! And no place for vested interests, if you know what I mean.
Tuesday, November 28, 2006
Ensuring product excellence
By 'product' I mean a typical software engineering product within the IT industry. Looked at purely from an experience point of view, achieving product excellence is not rocket science. However, there are many ifs, buts, and loose ends to take care of. Where do the loose ends lie?
Usually there is a 'Vision' team and an 'Execution' team. Care should be taken that people with the right skills are fitted into the right teams. For example, someone with very high qualifications from the world's leading technical institutes may not necessarily belong on the vision team. It might be a worthwhile exercise to conduct an aptitude-fit test such as the MBTI and see who qualifies to be part of the Vision/Strategy team. If we follow the balanced scorecard method of evaluation, the score table will point to the people who understand the industry dynamics, have continuously tried out other products in the same league, understand the strengths and weaknesses of competitors' products ('know thine enemy') as much as their own, and, most importantly, can provide an unbiased, devil's-advocate kind of opinion when it comes to product strategies.
I have seen, in well-established product companies, product marketing teams so soaked in their own products that they blindly evangelize them without working on the gaps.
The strategy team should also evaluate positioning, packaging, and pricing the product innovatively, and this should be backed by a lot - a lot - of research. Essentially, this team is what will increase shareholder value; if the guard on the hilltop does not see the enemy marching through the terrain, the soldiers in the plains will see no attack coming.
Okay, what about the 'execution' lot? It's no inferior position to be in. Since the visionary team is not perfect, the execution team should have people who are pure managers as well as half-leaders/half-managers. The execution team needs pure managers to see that the product design is well done, that reviews are injected at every stage, that risk is well managed at every stage, and that ample automation is planned for building the software as well as for quality assurance. We need a balanced scorecard here too. Checklists will not work if they only track whether a task or review was completed; instead, they should track whether the task or review was 'rightly' done or not.
And if there are hidden strategists within the execution team, they should be allowed free expression of opinion and not be suppressed under multiple managerial levels. How often have we seen a brilliant idea or prototype from a junior executive, not well understood by his dinosaur manager?
So, essentially, the visionary team and the execution team go hand in hand. Both are equally important, and no one deserves a place on one team or the other purely by paper qualifications. It should be based on fire in the belly and the natural wiring of the individual.
Product excellence is a function of right vision and right execution.
Tuesday, October 24, 2006
I can't change my hair style! My biometrics will fail!
Alright, this statement is slightly exaggerated - but only slightly. We might arrive there soon.
If you are an avid Internet surfer and a member of many web portals, just think about how many times you have clicked 'Forgot Password'. It's hard on us. With all due respect to SSO, I need to memorize about 10-15 passwords and use a password manager (it's easier to crack the password manager utility than the password itself), and in spite of all this, I still request the forgotten password. Recently I read that 60-75% of the help-desk requests to famous portals are for forgotten passwords.
Biometrics has made great inroads. Not all laptops have the facility, and it's especially absent among all the sales personnel of the world who travel with unsecured laptops. But if biometrics does become commonplace and wins the catch-up game with the hackers, we might be in for some reprieve.
I'm particularly interested in how face-, finger- and voice-based biometrics work. Some systems have a multi-layered verification scheme: a registered fingerprint, a voice-based pass phrase, and then the password itself. However, as far as I have seen, you are again tied in - hard-coded - to the fingerprint recognition software's provider.
With the face, it's trickier. There are supposedly technologies that will recognize a face even if its dimensions change with a smile (don't smile ear to ear), a frown, or a blink, but the risks are as great as those associated with regular image-mapping techniques.
What if the face is posed at an angle of 15-20 degrees? What if the light is insufficient?
The most important thing to remember is the title of this blog - yes, face biometrics is yet to transcend the changes made by your hairdresser. It's quite possible that the world's leading hairdressers will register themselves in a face-provider registry, be discovered by the face-biometrics web service, and have your latest face sent in for verification.
How about using a string of regular passwords as in older times ;-)
Sunday, September 10, 2006
Can 'My PC' be presented to me same everywhere? Trends for Mobile Computing!
It's been funny - the whole change in my gadget setup over the last few months. I took a fancy to those laptop ads on TV where a funky-haired guy sits on a beach and laptops his way through work. I was obsessed with buying a light laptop - the lightest, in fact - which I would connect over wireless from anywhere at home and work from. It could be from the terrace, staring at the moon and stars, or lying on a heap of pillows with the laptop on me. Yeah, all this worked, and how! Until I found this new piece of software (Mobipocket) that installs on my mobile and does practically everything. Now my whole paradigm of working has changed. I use only my mobile to read the news in the morning (downloaded as RSS feeds from the news site) - my family is very happy about this because we don't fight for the early-morning newspaper anymore - and to read the latest technology, business, and Scientific American posts. I also religiously download the technical documents I need to review as part of work - functional specs, technical specs, SOWs, contracts, what not - in a mobile-optimized format right on my mobile. I review these documents at night with the reading lamps off, without disturbing anyone... man, what a change! As you have guessed, I also download fiction and non-fiction, and I can read a whole book at night without switching on reading lamps and being a nuisance to others.
Mobile-ing of documents, mobile-ing of news, mobile-ing RSS aggregation has changed my life.
My wish list would be like this:
- Single Sign On from my mobile to a range of sites, to VPN of my office network
- A more reliable VoIP calling service that I can use for conference calls across the globe (my OS is Symbian, and Skype doesn't yet work as expected, although Google's gdskype? has something similar)
- Email profiles automatically maintained by my email provider. For e.g., I want only certain emails to be delivered, or I want all emails that meet a criterion to be marked color=RED on the server side, the desktop-client side and the mobile
- User profiling and 'My Documents' implemented for 'Me' across any device. For e.g., if a global SAN is maintained for me as a user with my documents (just like email - could be personal, work, music etc.), then I can access 'My Documents' from anywhere via any protocol.
- LDAP - Global address book with appropriate rules for categories and grouping rules
- Imaging or something like imaging. For e.g., if I'm at SFO airport trying to access my mails and documents, can my 'image' of my folder structure, my documents and my profile be automatically configured on the go, so that I can work in my own environment even from an airport terminal?
- Hack proof methods
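The email-profile wish above could be sketched as a small server-side rule engine. This is only a toy illustration - the Message class and the rule shapes are my own assumptions, not any provider's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    subject: str
    tags: list = field(default_factory=list)

def apply_rules(msg, rules):
    """Run every (predicate, action) pair; actions tag the message once,
    server-side, so desktop and mobile clients render it identically."""
    for predicate, action in rules:
        if predicate(msg):
            action(msg)
    return msg

# Example rule: colour anything from the office domain RED.
rules = [
    (lambda m: m.sender.endswith("@work.example.com"),
     lambda m: m.tags.append("color=RED")),
]

msg = apply_rules(Message("boss@work.example.com", "Status?"), rules)
print(msg.tags)
```

Because the tag travels with the message, every client just reads it instead of re-implementing the filter.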
WDYT?
Saturday, August 05, 2006
Data versus Information
I was recently searching for foreign exchange rates on a particular date for the purpose of filing my income tax returns. I searched Google and other search engines for 'currency rate', 'exchange rate', 'currency converter', 'foreign exchange rate' etc. and tried all the usual terminology associated with it. I could not get the result I wanted. I was a dissatisfied customer of search engines. Then I did the linear search - going through the forex sites, the Reserve Bank site etc. one by one - and bingo! I found the Reserve Bank's site with exchange rates for every day of the last few years. And I noticed that they were tagged as 'Reference Rate'!
Do you get my problem, rather a search enthusiast's problem? While search is based on the keyword, the keyword itself may vary depending on the searcher's profile such as regional parlance, educational levels, professional levels etc.
While a layman would search for 'salary difference', an HR pro might search for 'salary gap analysis'. Layman: 'How to get modem working'; IT pro: 'IP configuration'.
So, just like there's a difference between what you seek and what you want, there is a difference between Data and Information. Data is raw, while Information is contextual (apart from being analytical).
While providing information from raw data, it is a good idea to look at the following thumb rules:
- User profile, what kind of user is asking for this.
- User Topology and demographic information
- Automatic learning by the information classification system. e.g. collections or business rules should be dynamically changed as the system learns more and more from the information
- The learning itself is automated by a good business intelligence and analytics tool
- User experience management - this is a big discipline. In short, monitor how promotions were effective, how good merchandise interfaces are being used, how relevant offers are based on user identity etc.
- User context is always maintained and continuously evolved. One example: if a user is found searching for certain terms in the legal profession and the same user searches for certain terms in advanced robotics, the system can 'learn' that the user is a well-read person. Similarly, if a user always types with spelling errors and searches for typical teenage non-intellectual subjects, the user can be put into a collection of casual, not very well informed users.
- Similarly, reports that contain information should be shaped according to context.
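At its very crudest, the thumb rules above could start as a profile-driven synonym table that rewrites a layman's query into the term the data source actually uses. The table entries are illustrative, borrowed from the examples earlier in this post:

```python
# Toy query rewriter: map a casual phrasing to the expert/domain term.
# The synonym table is an illustrative assumption, not a real engine's data.
SYNONYMS = {
    "exchange rate": "reference rate",          # the RBI's own terminology
    "salary difference": "salary gap analysis",
    "how to get modem working": "IP configuration",
}

def expand_query(query, table=SYNONYMS):
    """Return the expert phrasing for a layman's query, or the query as-is."""
    return table.get(query.strip().lower(), query)

print(expand_query("Exchange Rate"))       # rewritten to the domain term
print(expand_query("currency converter"))  # no mapping known, left unchanged
```

A real system would, of course, learn this table from user behaviour rather than hard-code it - that is where the business intelligence and analytics tooling comes in.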
It's probably the next 'in' thing within Search and other information retrieval systems for laymen.
Saturday, May 13, 2006
The insured testing professional
Have you ever interviewed 'testing professionals' whose resume has everything under the sun under testing? While there are exceptions, I have most often seen people appear for a testing job unprepared for a technical interview. They can answer any number of 'process' or theoretical questions - what black box testing is, or integration testing, regression testing etc. - but when you drill deeper into solving a day-to-day problem, they cannot respond very well.
I'm wondering if it is a chicken-and-egg problem. The process of quality assurance is not the first in the software development cycle. A test spec almost always follows a requirements or functional spec, and at that very stage the tester transforms into someone who is supposed to follow or validate what the developer thought of. Is the test spec always limited by the functional spec's scope? Are we creating/developing testers who just need to validate, and need not really invest in understanding the latest technology trends and the latest product vulnerabilities?
Let me give a couple of examples.
Ask how the interviewee would test a simple web application that submits a form with two simple fields, 'name' and 'address', to a repository at the server. More often than not - 90% of the time and above - the first response you get is UI testing: 'Validate the input fields', 'check for input length', 'check for special characters'... well, any tester is supposed to already know this kindergarten stuff of testing. Okay, delve a little deeper and ask the interviewee to provide functional examples, and bang comes the reply: 'check the database and see if the input entered via the form is updated properly'. Ask him/her, 'do you think data can be entered by other means, and should you test for those conditions?', and again bang comes the reply: 'well, the form should have strict validation, and on the server side the database administrator should take precautions'.
I'm yet to see testers talk about vulnerabilities in authentication, input filtering, SQL injection, transport security, error handling etc.
I hope I'm proved wrong. I think 90% of testers are people who do what they are asked to do and the functional spec is an insurance for them against customer defects or hacker attacks.
Check it out - ask an interviewee next time 'How can my Forgot Password feature be exploited'?
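To make the point concrete, here is a sketch of the kind of test I wish interviewees would volunteer: probing that name/address form with an SQL-injection payload. The table and field names are hypothetical, and sqlite3 stands in for the real server-side store:

```python
# Sketch: why server-side input handling must be tested, not assumed.
# Hypothetical 'contacts' store; sqlite3 stands in for the real database.
import sqlite3

def save_contact_unsafe(db, name, address):
    # Vulnerable: builds the SQL statement by string interpolation.
    db.execute(f"INSERT INTO contacts VALUES ('{name}', '{address}')")

def save_contact_safe(db, name, address):
    # Safe: parameterized query; input is data, never SQL.
    db.execute("INSERT INTO contacts VALUES (?, ?)", (name, address))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE contacts (name TEXT, address TEXT)")

payload = "x', 'y'); DROP TABLE contacts; --"
try:
    save_contact_unsafe(db, payload, "addr")
except Exception as exc:
    # The injected quote turns one statement into several; sqlite3 refuses.
    print("unsafe version breaks on the payload:", exc)

save_contact_safe(db, payload, "addr")  # stored verbatim, nothing executed
count = db.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
print("rows stored:", count)
```

A tester who thinks like this is testing the product, not just the functional spec.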
Saturday, April 08, 2006
False Positive Emails
Today I dug out a very important email from the SPAM folder of my mailbox. As I had guessed, it was a case of 'False Positive Filtering'.
In the simplest terms, false +ve emails are those which should have hit your inbox, but your email provider's spam algorithm thought they were spam. So, how does this work? Usually, sites such as Google, Yahoo etc. keep their spam rules very secretive for security reasons.
Common possibilities are these:
- When the email's HTML contains links to images with names remotely suggestive of porn stuff
- Many spam emails ask user to explicitly click on link 'remove me from this list' only to trap their email and domain names to send more spam. So, sometimes, even genuine emails with 'remove me....' can be considered spam
- If the sender's email domain does not match the 'From' string. For e.g., if the mail says From: 'Income Tax Dept' and the domain is 'gxbvghh.com'. Sometimes genuine emails, sent as mass mails using a 3rd-party provider, can get flagged as false positives this way.
- Suggestive attachment names
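For illustration, the heuristics above can be mocked up as a crude rule-based scorer. The rules and weights here are my own guesses - as noted, the real providers keep their algorithms secret:

```python
# Toy spam scorer for the heuristics listed above. Weights are illustrative
# assumptions, not any provider's actual (secret) rules.
def spam_score(sender_domain, from_header, body, attachments):
    score = 0
    if "remove me from this list" in body.lower():
        score += 1      # unsubscribe-bait wording, even in genuine mail
    if from_header and sender_domain not in from_header.lower():
        score += 2      # 'From' display string doesn't mention the domain
    if any(a.lower().endswith((".exe", ".scr")) for a in attachments):
        score += 2      # suggestive/dangerous attachment names
    return score

# A genuine mass mail can trip the same rules: a false positive.
print(spam_score("gxbvghh.com", "Income Tax Dept",
                 "Click remove me from this list", []))
```

Notice that nothing in the scorer distinguishes a genuine tax-department mass mail from spam - which is exactly how a legitimate email ends up in the bulk folder.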
Interestingly, there are metrics to measure email deliverability, and one of them is the rate of false positives. I believe Google has the highest deliverability, at around 99%.
So, look for those missing emails in your bulk/spam folder.
Friday, March 24, 2006
Software as a service (SaaS)
This is the newest coinage of a term, making a new wave now, with big companies like IBM hosting exclusive seminars on it. Idea-wise and technically there is nothing new here. It is simply the culmination of software evolving from providers offering one-size-fits-all products to a more collaborative, connected world where each provider offers a solution and everything falls nicely into place with everything else. 'Live and let live'.
The sops offered are for CTOs and CIOs: fewer maintenance headaches, no need to get on the application upgrade treadmill, no bother about compatibility and adaptability and, in some ways, the best of breeds put together.
Although ASPs and on-demand hosting are cousins of SaaS, there are subtle differences. SaaS is loosely based on SOA, where you can have services (a la products) talk to each other at different levels. It could be at the raw API level, the architecture level or the application level. All the usual demons of integration have to be planned for in advance and addressed: security, service policies (which can themselves be a service), accounting and authorization, platform dependencies... etc.
A SaaS-modelled application can be hosted or on-premise. The licensing and pricing models and rules would be different from what we have today.
Web developers today are still not trained on technologies such as BPEL to readily develop SaaS based services. Most developers would be happy to integrate services/apps using plain old XML and HTTP. And again, most old software cannot be thrown away for want of SaaS. So, the trend would be to develop wrappers around them and 'somehow' make them a service that others can discover, register and use.
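A minimal sketch of that wrapper idea: exposing a legacy routine as plain XML that other services could consume over HTTP. The legacy_lookup function here is a hypothetical stand-in for the old software:

```python
# Sketch of wrapping legacy code as a plain-XML 'service'.
# legacy_lookup is a hypothetical stand-in for an old, non-service routine.
from xml.etree.ElementTree import Element, SubElement, tostring

def legacy_lookup(customer_id):
    # Pretend this calls into decades-old in-house software.
    return {"id": customer_id, "status": "active"}

def as_service(customer_id):
    """Wrap the legacy call in an XML envelope other services can consume."""
    data = legacy_lookup(customer_id)
    root = Element("response")
    for key, value in data.items():
        SubElement(root, key).text = str(value)
    return tostring(root, encoding="unicode")

print(as_service("42"))
```

Serve that string over HTTP and the legacy code is, 'somehow', a service - which is precisely the pragmatic path most shops will take before touching BPEL.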
I saw a cynic's post calling SaaS 'Same old software as a service'. For those of us who know Hindi, 'saas' means mother-in-law, and I guess this is closest to its meaning. 'Put up with it'!
Wednesday, March 01, 2006
8 terabyte desktop? For whom?
I just read this piece of news: 8 Terabytes Desktop
It is like you can never have too much of the goodies! With such a large disk space on my desktop, I can have 'anything' I want stored on it. One of the first obvious advantages is that I can store videos and movies, archive literally everything, and will 'never need to delete but just sort' - a la Gmail.
But think about it. This might be great for a student or a graphic designer who needs to store loads of data and works from one single place. But what about the regular IT pro? An IT pro's profile is like this: he/she travels from home to office and back every day, sometimes to different offices within the country or outside his/her home country, and sometimes to other destinations on travel. So, how would he/she efficiently use the TBs of info sitting on a desktop at one office? Will this massive data storage be augmented by massive network bandwidth and massive backup and recovery strategies?
To me, an ideal environment would be like this. I would stash away the laptop and any dependency with the hard disk out there. I would log on to the internet - from home, office, airport, from my car (car can have a light weight network computer), from my hotel and probably from general travel lounges across the world without being bothered about disk space, backup, RAID and security.
To me, internet bandwidth and availability are much more useful than terabytes of storage on a PC.
Thursday, February 16, 2006
Economies of Data Integration
What are the real issues or 'real time' issues of data integration? While we see everyday ads or claims from companies about how their software can integrate data, there are still a lot of nitty-gritties that have to be carefully considered while making a 'one strategy for all' decision. One of my favourite quotes is this: 100% automation leads to ineffectiveness.
All organizations, from the smallest of SMBs to the largest enterprises, have silos of data and silos of information across finance, IT operations, various customer or client data repositories, internal repositories, internal workflow systems etc. Unless an enterprise is ideally designed, most of them will have parallel streams of workflow. One stream would be the integration between internal operations, finance, costing, auditing and compliance divisions. These are largely Microsoft Excel based, Tally based or based on similar tools. How many of the currently available accounting packages, network monitoring tools or compliance checkers are interoperable with each other and with enterprise data flow management? The internal ops data is also highly sensitive and protected, and the typical finance manager does not know of, or does not believe in, SSO-like schemes - he only trusts authentication established locally. In 80% of companies, according to a research site on macroeconomics, the most sensitive financial data still rests on the hard drive of the chief finance person's laptop - which is, of course, well backed up.
The second stream that can be integrated is the in-house knowledge and information management, including competence building, training, and market gap analysis well tuned with product analysis and gap-based skills acquisition. This stream's data is not as hard to integrate as the first stream's, but it is nevertheless difficult. Each division - Corporate Training, Knowledge Management, IT Management, Program Management etc. - usually has a different product controlling it, as varied as Microsoft spreadsheets, Siebel based systems, SAP based systems and Tivoli based data. Well, I can hear the SOAists screaming, but we still have a long way to go on that.
The least difficult of the data integration streams is the enterprise level data and information
integration. We all know that all major enterprise players have developed various adapters and
stand alone products which will recognize and discover services, understand business rules,
transform business data according to that and 'talk to each other'.
I'm trying to gather information about how well information integration itself is working out. Readers, please pass on any web sites that publish information about the economics of service-to-service adoption and how the returns measure up against cost in clear, quantifiable terms.
Friday, January 06, 2006
My dear Biz portal- get noticed!
After crossing all the hurdles of creating a business portal and effectively building its functional aspects comes the real hurdle - ensuring that THIS particular portal is what customers will find when they look for one. Easier said than seen!
Broadly, this means the customer may be directed to the business portal via advertisement banners, reference links, pay-per-click programmes etc. But the largest possibility is that the customer arrives via a search engine. Again, not all search engines. According to one study, customers find a business site 35% of the time using Google, 30% of the time using Yahoo, about 15% of the time using the AOL or MSN search engines, and occasionally using the likes of Lycos or AskJeeves. While I'm not vouching for these claims, the figures do match what we commonly know. So, it boils down to essentially being spotted by Google, Yahoo and perhaps MSN.
The second requirement is that it is not enough to just be listed (or discovered) by these search engines. Most users behave like this (again, study-based data): 75% of the time, the user quits or changes the search criteria once he/she doesn't find relevant information on the first page. That translates to about ten to fifteen results at the maximum, assuming the user has a full-sized window.
So then, how do you get to the top of the search results page? There is no straightforward formula. The parameters are varied, and the web site constantly needs to optimize itself in order to be found by these kingpin search engines.
Basic optimization is to fine-tune the primary keywords, phrases and descriptions.
The next step is to look at the search engine's ranking and its directory positioning. This will help the web site designer optimize the site for the most suited search engine.
The next step is to select the best pages or right pages and mark the content for theme-based indexing.
Look for those keywords, phrases etc. that may be marked by search cops (spiders) as spam, even when it is unintentional.
Make keyword-optimized directory submissions so that web services can discover the right services in a jiffy.
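The optimization steps above can be partly mechanized. Here is a toy checker that verifies the primary keywords actually appear in a page's title and meta description - the HTML snippet and keyword list are purely illustrative:

```python
# Toy keyword-coverage check for the basic optimization step above.
# The page and keyword list are illustrative, not from a real site.
import re

def keyword_coverage(html, keywords):
    """Return the keywords missing from <title> and the description meta tag."""
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'name="description"\s+content="(.*?)"', html, re.I)
    haystack = ((title.group(1) if title else "") + " " +
                (desc.group(1) if desc else "")).lower()
    return [kw for kw in keywords if kw.lower() not in haystack]

page = ('<html><head><title>Acme Widgets - Buy Widgets Online</title>'
        '<meta name="description" content="Cheap widgets, fast shipping">'
        '</head></html>')
print(keyword_coverage(page, ["widgets", "shipping", "gears"]))
```

A report of missing keywords like this is the sort of thing a designer would run after every content change, before the spiders come crawling.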
Once these basic steps are done, the 'get noticed' result is achieved. However, this is by no means the end of web designing, because a lot goes into traffic planning, site mapping and navigation architecture, image structure, site structure etc.
Site hit report and analysis is a key market intelligence area.
Wednesday, December 21, 2005
Information at any layer can be used for business
Today, I was surprised to see the local courier boy dropping specific notice pamphlets into targeted post boxes. I did my little bit of investigation and found that he had notices printed for a day care center. Essentially an ad - and he was carefully dropping it into those post boxes owned by families with kids and toddlers.
I was happy to note several things. One, the day care sponsor had used an effective service to do the advertising. Two, there was no wastage in the service; it reached the right people. Three, it was using a layer in between the manufacturer and the distributor for advertising.
It was like the Feedster model you know. Earlier we googled web sites, then we were provided with search facilities for news sites, images, printed books ...etc etc. But services like Feedster searched the feeds and that's like searching the postman for the right information while he carries loads of important mails meant for all. The revenue model is based on sponsored searches and advertisement programs based on successful usage of feeds. Feeds, not web sites will be the next order!
There, I get a pop-up asking if I want to use the 'AdSense' program and make money while you read this.. :-D
Merry Christmas and a Happy New Year.
Wednesday, December 07, 2005
Eyes on the Prize?
I read a good article in Business Week, 'Eyes on the Prize'. The author says that he "instituted a well-designed bonus program in 2004, tying employees' pay directly to their performance and to the company's profitability". This is a fantastic method, and in the age of capitalism it works great.
I have always advocated performance-based reward systems. But they are easier said than done. First of all, the organization should have an effective measurement and evaluation system. This cannot always be based on a formula. For number-centric or quantifiable-target-centric organizations it might be a shade easier, but for a global organization it becomes difficult. If awards are a direct function of performance, then performance should also be highly visible and individualistic. So, in this system it is great to award a sales manager who has bagged successful accounts for a target dollar sum. But we also need equally effective systems to measure and evaluate the quiet yet effective guy, 'the behind-the-scenes man'.
I have seen this work both ways in organizations. For effective evaluations, there should be 360-degree feedback that includes direct management, the peer group, influence groups, the operations group and maybe even the support staff. An employee is a part of the organization first and foremost. When there is a wide array of feedback, all facets of the employee come into the picture and no single function or individual can overly influence the judgement. But this evaluation will be somewhat abstract and empirical, and cannot be converted into a mathematical formula.
How often has one seen the nice and quiet yet effective guy get great performance reviews? In a 1000+ strong organization? If this indeed happens, this can be a fair indicator that performance based reviews and award systems are indeed working.
The other side of the story is this. Most large organizations have numbers skewed towards the top management, not only because they are more valuable to the organization but also because there is a thinking somewhere: 'Oh well, I cannot rate this top guy low now. If I do, someone is going to ask why I didn't point at his underperformance before. I didn't evaluate at all, so I had better give him a good rating for performance.'
So, performance and values flow top to bottom and that should be monitored. What gets measured gets done!
Monday, December 05, 2005
Simple yet Secure login (albeit SSO)
Today I was attempting to convey my requirements for an application that will capture software-release details in an incremental fashion. For example: what percentage of new features are really requested by customers as enhancements, what percentage are influenced by competitor products (or both), and the cost of staffing for the same. I had to source data from twenty different managers and thirty different applications from a very heterogeneous background, and I needed a simple yet secure way for information to be entered.
I began my design with a good single sign-on system. The industry has many providers, including SAML-specific open source solutions. But what would influence my purchase of a good, secure single sign-on system?
Will my secure authentication (rather, THAT one login and password) work across the legacy systems of accounting, financials, and training/competence-skills repositories? I understand there are 'connectors' to all these kinds of systems based on .NET, Cobol, C, Windows, Mainframe, Visual Basic etc. Will these connectors connect and be the single gateway into all these systems? Is security built into the system such that it checks for side-channel access, such as reaching the database through a backdoor SQL script while a robust SSO sits waiting for users to authenticate?
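The connector idea can be sketched minimally: one credential check at a gateway, which then opens sessions through per-system adapters. All class and method names here are hypothetical, not any vendor's API.

```python
# Hypothetical SSO sketch: the gateway authenticates once; "connectors"
# are the single entry points into each legacy system.
class Connector:
    """Adapter for one legacy system; the gateway is its only caller."""
    def __init__(self, name):
        self.name = name
    def open_session(self, user):
        return f"{self.name}-session-for-{user}"

class SSOGateway:
    def __init__(self, valid_users):
        self._valid = valid_users          # stand-in for a real auth store
        self._connectors = {}
    def register(self, connector):
        self._connectors[connector.name] = connector
    def login(self, user, password):
        if self._valid.get(user) != password:
            raise PermissionError("authentication failed")
        # one successful login yields a session on every connected system
        return {name: c.open_session(user)
                for name, c in self._connectors.items()}

gw = SSOGateway({"alice": "s3cret"})
gw.register(Connector("accounting"))
gw.register(Connector("mainframe"))
print(gw.login("alice", "s3cret"))
```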
If some of my data-sourcing applications are upgraded, will my security guard still work without a recheck and a cold failover? If I add a few more data sources, can they again be 'hot pluggable'?
It is possible that legacy systems were not coded with secure coding practices - for example, exposing possible access information as external parameters, URL parameters, or hardcoded strings dumped in log files. Can my SSO software detect this, poll and find out for me? In essence, I'm asking not just for a security guard but for an advanced CIA agent who will also do security-guard duty for me. Too much? Well, there is another popular term for 'you are asking for too much', and that is 'out of the box'.
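That 'CIA agent' duty can be approximated by scanning logs and URLs for credential-shaped strings. The patterns below are illustrative only and far from exhaustive.

```python
# Scan log lines and URLs for access information that legacy code may leak.
import re

LEAK_PATTERNS = [
    re.compile(r"password\s*=\s*\S+", re.IGNORECASE),
    re.compile(r"passwd\s*=\s*\S+", re.IGNORECASE),
    re.compile(r"[?&]token=[A-Za-z0-9]+"),
]

def find_leaks(lines):
    """Return the lines that appear to expose access information."""
    return [line for line in lines
            if any(p.search(line) for p in LEAK_PATTERNS)]

log = [
    "2005-12-05 10:01 user login ok",
    "DEBUG connecting with password=hunter2",
    "GET /report?token=abc123 HTTP/1.0",
]
print(find_leaks(log))
```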
Has the software been tested with users at scale? How is performance when 500 users log in at the same time? I have seen numerous industry-specific benchmarks, but you rarely get that kind of performance when you deploy. This is much like an automobile's mileage under 'test' conditions!
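A rough sketch of that concurrency question: drive a stubbed login from many threads and look at worst-case latency. `stub_login` is a stand-in for a real SSO endpoint, not any product's API.

```python
# Hammer a stubbed login endpoint from a thread pool and report latency.
import time
from concurrent.futures import ThreadPoolExecutor

def stub_login(user_id):
    start = time.perf_counter()
    time.sleep(0.01)                    # pretend the server takes ~10 ms
    return time.perf_counter() - start  # this caller's observed latency

def load_test(n_users, n_workers=50):
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        latencies = list(pool.map(stub_login, range(n_users)))
    return min(latencies), max(latencies)

fastest, slowest = load_test(500)
print(f"fastest {fastest:.3f}s, slowest {slowest:.3f}s")
```

Under contention the slowest login can be far worse than the benchmark figure, which is exactly the mileage-under-test-conditions point.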
Finally, do SSO deployments handle authentication such as identity cards with the same robustness as pure login authentication? No, no, forget biometrics for now. I want simple yet fully secure systems.
Thursday, November 24, 2005
I Tech, therefore I am
Gadgets- can they get better, soon?
I recently read an article about how gadgets (read: technology) have changed our lives and made everything easy. But the cynic in me doesn't completely agree. The dreamer in me dreams of the following, to name a few:
I want to throw away my cell phone - can't my watch, a small-sized watch, do the same functions? Wait, I know of bulky watches which also function as a PDA, mobile etc., but that's not what I want. I want a completely voice-interfaced phone. So, if I have to make a phone call, I just whisper to my watch 'Call John'. Of course, John's number was earlier stored in my phone using voice again, with a command whispered to it such as 'Store John Mathew five five six.....two two'. How do I listen and speak to a person using the watch? I will lift my wrist close to my ears and mouth. Don't I do this with my mobile in any case? I don't need a Bluetooth attachment to my watch; remember, I'm talking about making life easier with gadgets, not chaining my whole body with gadgets. Every other PDA function can be voice-commanded. The watch can have a small adapter, almost invisible, which I can plug into a computer once in a while to get reports, lists, schedules etc. printed. I'm of course thinking only of able persons here, not about Section 508 accessibility requirements.
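For fun, the whisper-commanded watch could be sketched as a tiny command parser, assuming speech recognition has already turned the whisper into text. Everything here is a toy; the only vocabulary is the spoken digit words.

```python
# Toy parser for 'Store <name> <spoken digits>' and 'Call <name>'.
DIGITS = {"zero": "0", "one": "1", "two": "2", "three": "3", "four": "4",
          "five": "5", "six": "6", "seven": "7", "eight": "8", "nine": "9"}

class Watch:
    def __init__(self):
        self.contacts = {}

    def hear(self, utterance):
        words = utterance.split()
        if words[0].lower() == "store":
            # everything before the first digit word is the contact's name
            i = next(k for k, w in enumerate(words) if w.lower() in DIGITS)
            name = " ".join(words[1:i])
            self.contacts[name] = "".join(DIGITS[w.lower()] for w in words[i:])
            return f"stored {name}"
        if words[0].lower() == "call":
            name = " ".join(words[1:])
            return f"dialing {self.contacts[name]}"
        return "unknown command"

w = Watch()
w.hear("Store John Mathew five five six two two")
print(w.hear("Call John Mathew"))
# dialing 55622
```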
Fingerprinting has been talked about as a pretty big thing in the industry, but the scale is really small yet. If it does take off, I can just use my finger and do away with the ID card to my workplace, the keys to my house, the keys to my office. If I extend this fantasy, some day, maybe next century, there would be no passports, no logins, no bank account numbers to remember or jot down. The possibilities are stupendous. And I do know that someone could cut my finger off and have everything that's mine, but everything really has a risk. I could book air tickets, draw a bank draft, sign an online cheque, pay bills... all at the drop of a hat, rather the drop of a finger.
The laptop - the industry has toggled on and off the network-enabled slim PC (rather, a dumb terminal that just acts as an interface to an all-knowing, all-powerful remote server). The laptop is a pain to carry, a pain to recharge and a pain to keep connected to Wi-Fi terminals, especially if a VPN has to work through it with a whole big bunch of collaborative tools depending on it. For example, while I'm in a meeting, sitting in an airport (aha - isn't this the ideal use case for a collaborative tool), drawing my plan on a NetMeeting-like whiteboard, I suddenly find that my Wi-Fi connection dropped for 0.5 seconds. A tier-1 user on the plain internet will not observe this, since the network immediately reconnects in a transparent fashion. But see, I'm a heavy tier-2 or tier-3 user. My VPN firewall, being very security conscious, goes off with the smallest change in the network, and my whole lot of workplace tools (my drawing on the whiteboard, my messages, my control of another's desktop) is all gone with the wind! I have to negotiate my VPN again, reconnect to the meeting and do everything again. Now tell me, this is not a remote possibility, is it?
Most ads for networks, tools, grid or utility computing, storage management, journaling of lost work, backup, disaster recovery etc. make it all look very simple on TV and in magazines. They really are not.
The list can go on...I will stop here. Thanks for reading.
Friday, November 18, 2005
What should a SOA enterprise server do for me
While SOA as a technology may not be mature enough, and is still primitive at the adoption level, the concept behind it is promising. As I understand SOA better and better, I begin to have more and more expectations.
Let us talk about 'pulling'/'pushing' data. Data or information may, case by case, exist in various forms in heterogeneous applications. I can get my data stream via messaging queues, as with railway ticket bookings and online lottery engagements, the most often used example being stock and forex quotes.
Or I could have my information come through SMTP systems (our dear email, over the Simple Mail Transfer Protocol), which according to me is really a content-managed workflow system that has been time tested and works very reliably.
Or from the traditional RDBMS APIs or database web services which follow the open standards of SOAP and WSDL.
Or via real-time feeds, such as a patient's heart beat, pulse, BP, lung saturation etc. from the monitors in a hospital Intensive Care Unit onto the doctor's real-time monitoring system.
Or via applications and information from enterprise systems such as SAP- or Siebel-based applications, where data comes straight from SAP (abstracted at the app level, not via APIs or deeper layers of data storage and persistence).
Or from the hugely invested silos of information in mainframes, such as CICS, where data sometimes needs to be scraped from screens rather than taken from data layers.
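The first of those sources, a message queue, can be sketched with Python's standard `queue.Queue` standing in for a real broker; the quote messages are made up.

```python
# A producer posts quote messages to a queue; a consumer drains them.
import queue

def produce(q, quotes):
    for symbol, price in quotes:
        q.put({"symbol": symbol, "price": price})
    q.put(None)                     # sentinel: no more messages

def consume(q):
    received = []
    while (msg := q.get()) is not None:
        received.append(msg)
    return received

q = queue.Queue()
produce(q, [("INFY", 2870.5), ("TCS", 1712.0)])
print(consume(q))
```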
And when I do successfully get continuous data out of these sources, I should have the facility to define how different types of data are to be validated (for example, the doctor should be able to say, 'Attend to the patient without validation, if it is an emergency, without bothering to register him/her in the hospital repository'), computed (for example, no taxation for income levels below a certain amount), and transformed ('Get all those long binary strings and convert them to giga object types').
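Those user-defined validate/compute/transform rules can be sketched as a small pipeline with per-record exception capture. The rules mirror the examples in the text and are illustrative only.

```python
# Run each record of a stream through a chain of rules; collect failures.
def validate(record):
    if record.get("emergency"):
        return record                     # skip validation in an emergency
    if "patient_id" not in record:
        raise ValueError("unregistered patient")
    return record

def compute_tax(record, threshold=10000):
    income = record.get("income", 0)
    record["tax"] = 0 if income < threshold else round(income * 0.1)
    return record

def process(stream, rules):
    ok, failed = [], []
    for record in stream:
        try:
            for rule in rules:
                record = rule(record)
            ok.append(record)
        except ValueError as exc:
            failed.append((record, str(exc)))
    return ok, failed

records = [{"patient_id": 1, "income": 5000},
           {"emergency": True},
           {"income": 20000}]             # no id and not an emergency
ok, failed = process(records, [validate, compute_tax])
print(len(ok), len(failed))
# 2 1
```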
I should also be able to inherit validation, transformation or security rules, because they have all already been defined and running in all my heterogeneous systems for years!
I understand that 80% of SOA effort goes towards exception handling. That's logical, because SOA itself requires an integration layer in between the model and presentation layers, and with the variety of different interfaces and their complexities, the exceptions can be plenty. But they all need to be handled as beautifully as they were handled in their original standalone systems, if not better.
The whole thing looks very promising, but as I sit down with the requirements, I'm bogged down because it looks complex. User experience will dictate a lot of design in this area, and the end-user is King!
Wednesday, November 02, 2005
Ideal Browser world and rich applications
Today I experienced a very current and relevant problem. I was looking for software with which I could design virtual homes. Basically, I needed an app that would draw a house plan, take my inputs for furnishing, structures etc. and give me a 3D view of the whole thing. I needed it to be simple to use yet have complex (not complicated) functions. For example, I need a moon roof which should not slope down but be flat!
I searched the internet and found two options. One, there were tools that provided this functionality over the browser; that is, I pay a hosting fee and I can draw, paint, and drag and drop through an applet in a browser. The second option was to buy software on a CD that would install as a desktop client. I used both, and my pain points were all relevant to the problems we have in the industry today.
The issues with the browser-based tool were these. The internet, even with great broadband, is still not that fast, especially with latency and variable speed. So, when I dragged and dropped a refrigerator object into my bedroom plan, it got dropped after a good 30 seconds. Request and response over HTML and HTTP worked, but it did not match my creative abilities.
The quality of the GUI - well, how rich can you make an object that is brought (or dragged) all the way from a server thousands of miles away? So, the app designer does have to make a trade-off. Can't the object just look 'like' a fridge? Come on, after all, this is just the plan, not your house! So, the component called refrigerator is rectangular in shape and has a grid across it, and the user is expected to decode that as a refrigerator. Colors? Well, just keep a default color for now, okay.
Can I have many GUI components - maybe different carpets, different tiles, different tap fittings, different draperies? Whew, no way. They are really fat components, and do you want to hog bandwidth dragging them all the way? They will hog memory too.
So, my GUI components are BLAFed (Basic Look and Feel) and are not really comprehensive in variety.
Third, what about storage, reuse and reliability? If my internet disconnects, does it journal my work? Can I start from the same point that I left off? Can I reopen my plan, copy it, version it, have some security and authentication on it? Hey, hey, come on, you are asking for all the things that you want in enterprise software!
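That journaling requirement can be sketched as an append-only action log that is replayed to rebuild the plan after a disconnect. The action format is made up, and an in-memory buffer stands in for a real file.

```python
# Append each editing action to a journal; replay it to restore the plan.
import json, io

class Journal:
    def __init__(self, backing=None):
        self.backing = backing or io.StringIO()  # a file on disk in real life

    def record(self, action):
        self.backing.write(json.dumps(action) + "\n")

    def replay(self):
        plan = {}
        self.backing.seek(0)
        for line in self.backing:
            action = json.loads(line)
            if action["op"] == "place":
                plan[action["item"]] = action["room"]
            elif action["op"] == "remove":
                plan.pop(action["item"], None)
        return plan

j = Journal()
j.record({"op": "place", "item": "refrigerator", "room": "kitchen"})
j.record({"op": "place", "item": "sofa", "room": "living room"})
j.record({"op": "remove", "item": "sofa"})
print(j.replay())
# {'refrigerator': 'kitchen'}
```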
No, the browser based 'host' did not provide any of these.
But yes, I had advantages, such as managing with less memory. I just needed a terminal with an internet connection and a printer.
Okay, I shifted to the CD-based client software. Boom, it installs so beautifully! But wait, how many files does it need to install? Oops, so many picture files; whoa, so many .jar files. Oh man, why is it downloading another JDK version? I have a compatible one here; can't it detect that? So, I finally run the executable and there it is. I gape at the client - wow! So beautiful. Such easy-to-use wizards, so many options that I can use. I just drag and drop a fully furnished bathroom and lo, I can see it in 3D. It looks beautiful.
But it suddenly vanishes. What happened? I go to Task Manager and open Processes. It has hogged about 512MB of RAM and there are five or six processes running. I start again after killing the processes myself. I do this many times, struggling to rebuild my plan every time it does the vanishing act. Then I look up the internet and buy an even better, more reliable software. Now I try to open my old house plan with my new software (what a dumb thing to do; whoever said that files work from one software to another) and it core dumps.
My requirements for these browser BLAF / rich-client software developers are just the same set. I need a standard way to get rich components on a browser, rendered via request and response, with all the functionality that the rich client provided, plus the convenience of using it over any-PC, have-internet technology. I will not do any installation or configuration on my PC. I need to move and drag components over like a glider. The components have to look lifelike - if I say I want a teal sofa, then it should look like the teal one. If I want a veneer finish, well, it had better look like what I saw at the store. I need to get a REAL feel of it. I definitely want failure restoration (cold failover and hot failover), versioning, team development (I might be developing one room while my friend develops another room from his PC) and the ability to save certain features as my 'styles' or profiles.
And last but not least, I need to see the same kinds of file types, component types and XML tags (standards :-)) so that I can open and save my plan using any vendor's software, as long as I pay the subscription fee.
So, how does this look? Am I demanding? Not really. Make it all simple!
Let me know what you think by sending me an email.
Saturday, September 03, 2005
SOA and ecommunity
Today I'm wondering what drives an e-community! The need to socialize in a structured manner hasn't gone away. Well, most of us want friends or acquaintances who have something in common with us: maybe interests or hobbies, maybe similar age groups and similar challenges in life, or similar life or behavioural patterns. Is an e-community service just for the hippie and funky? Definitely not - no service can really take off if it is not for the overall benefit of the larger society.
Most of our activities are centered around discovering services that are available in a directory. This is essentially what a UDDI registry-based service does. For example: find the trains that travel via a particular route; are tickets available; what kind of food will be available at the train stations, and can I get that information based on my eating profile; during my transit stays can I get some movie tickets - again, please look at my profile to see what movies I like; and I would like to meet some of my friends from my alumni network located in my destination city. Can I meet them? They are all buddies in different e-communities - some on AOL, some on Yahoo, some on Google. Can I see them all in one place, set up a meeting between us, and book a restaurant table too?
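The discovery step described above - a client querying a directory for services by category before binding to one - can be sketched in a few lines. This is only an illustration of the idea, not real UDDI; the registry contents and service names here are invented.

```python
# A stand-in for a UDDI-style directory: each entry advertises a service
# under a category, the way businesses would publish to a registry.
REGISTRY = [
    {"name": "RailSchedule", "category": "travel", "operation": "find_trains"},
    {"name": "StationFood", "category": "dining", "operation": "menu_by_profile"},
    {"name": "CityMovies", "category": "entertainment", "operation": "book_tickets"},
]

def discover(category):
    """Return all registered services matching a category - the lookup a
    client performs against the directory before calling any service."""
    return [s for s in REGISTRY if s["category"] == category]

print([s["name"] for s in discover("dining")])  # ['StationFood']
```

A real registry would also return binding details (endpoints, protocols) so the client can invoke the service it found, but the publish-then-discover pattern is the same.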
The options are mind-boggling. It looks like a utopian dream. But when businesses structure their data representations abstracted for web services, and when they can register their services in a standard fashion and publish them, it's going to be much easier to do many of the routine things we spend time on today.
Let us wish ourselves good luck!
Evolution of software developer
Wednesday, August 03, 2005
Life is becoming easier with technology, or rather for the application developer! About a decade ago, a good developer was one who invariably came from a computer engineering background. The usual interview questions would revolve around the order of an algorithm, the traveling salesman problem, or graph and automata theory! I was a proud engineer who reveled in all of these. Not for long. I slowly became witness to the software world leaning on the 'next door boys (or gals)'. No, not all jobs really needed brilliant computer engineers. We needed 'average' guys for jobs that included development, testing, technical writing, sample writing, porting, etc. And sooner rather than later it was proved that all these guys who came with an MCA degree or Electrical Engineering or Mechanical Engineering degrees were really valuable, since they picked up stuff pretty fast. Of course, the job did not require them to know discrete mathematics and regular expressions.
The software world looked towards developers who understood the domain, who could quickly draw up screen designs using tools like Visual Basic or Oracle Forms, who could rapidly write business logic using standard functions and packages, and who did not have to bother about writing tuned, performant database queries because there were already tools for that!
Now, the situation has become even easier! Yes, with declarative development, all the developer has to do is... 'DO'! He (or she) does not have to know the intricacies of code and syntax. If the application needs a date to be picked from a calendar, well, no code needed. Use the rich UI component library and drag and drop the calendar component. Do you want an on-demand audio player? Who really needs to store media files (those songs) in binary format, retrieve them as streams, parse... Come on, are you in the Stone Age? DRAG and DROP, rather DnD! Hey, haven't you heard of life cycle components?
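The declarative idea above - describe the page as data, let a generic renderer do the work instead of hand-coding each widget - can be shown with a toy sketch. The component names and the pretend markup are invented purely for illustration; no real UI framework is implied.

```python
# A declarative page description: what the developer "drags and drops".
# No widget code is written by hand - the page is just data.
PAGE = [
    {"component": "calendar", "field": "delivery_date"},
    {"component": "audio_player", "source": "greeting.mp3"},
]

def render(page):
    """A generic renderer: each component type knows how to turn its
    declaration into (pretend) markup, so adding a calendar or an audio
    player requires zero imperative code from the application developer."""
    templates = {
        "calendar": lambda c: f"<calendar bind='{c['field']}'/>",
        "audio_player": lambda c: f"<audio src='{c['source']}'/>",
    }
    return "\n".join(templates[c["component"]](c) for c in page)

print(render(PAGE))
```

The point of the sketch: the application developer edits only the `PAGE` data, while the component library owns all rendering logic - which is exactly why "no code needed" becomes plausible.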
Leave alone silly things such as changing colors, report layouts, and page breaks - they are all done declaratively.
No, again, don't worry about memory consumption or bandwidth consumption - after all, how will storage management and performance companies survive? Or router companies innovate? Give them a chance: write n-tier architecture code, compile it in a 'team development' environment, deploy it on a heavy web server, and hog memory and bandwidth. No problem.
Okay, what the heck is the developer supposed to know? Well, he should surely know how to pick the exact tools he needs, given that he is spoilt for choice. He should know usability, aesthetics, and customer behavior patterns. For example, if you want the customer to install your software, are you going to ask him to fill in a hundred questions first? Or would you err on the side of courtesy? Or on the side of security? Ease of use versus effectiveness? Do you provide a heavy rich client and give him everything on the thick client, or do you develop servlets to send the logic back to the server and keep the client thin?
So, the developer has at once developed into a super developer. He has to think about so many things ahead of design, things his geek predecessors did not bother about. Let us give the devil its due.
Did you enjoy reading this?