the value of an upgrade vs keeping things the same

Having been an admin in IT, I understand the pressure to keep things running and not mess anything up. There are two approaches to this. The first is to do nothing: if it works, don’t fix it. This is a great idea, but the problem is that as things age, they become more difficult to keep operational. When I worked at a local university, we used an old desktop to handle the web site for the university. The desktop was an old Sun SPARC machine that had enough memory to run an Apache web server. On the positive side, it worked and the university home page was stable and reliable. On the negative side, we didn’t have spare parts, new security patches, or an alternate platform to run the service on. We were at risk, and the reason we were at risk is that we opted to do nothing because it worked.

The second approach is to constantly upgrade systems. This tends to increase cost and cause instability due to change. Constantly changing systems requires time on the admins’ part and training for the end user. Upgrades also tend to bring in new features and functions that might or might not save time. For example, if we update the Apache web server and include PHP as a new option because it is now part of the standard template for all web servers, we need to be able to support developers who do not know how to use PHP and provide training on shared resources and development. That is additional cost for IT with no new revenue stream.

I mention a standard template in the second approach. I personally think that this is important. I found it very difficult to have 25 different implementations of the same product. When something broke or needed patching, it took significantly more time to update the systems. If all systems are the same, it just requires repeating the steps 25 times rather than recreating everything from scratch.

A recent Gartner study looked at cost-cutting opportunities for IT in 2009. The assumptions presented were worth noting. Assuming that a company generates $1B in revenue, it typically spends around 5% on IT. That works out to $50M in IT spend. The majority of these costs are support costs for hardware and software: 22% is associated directly with the data center, 5% with the help desk, and 13% with desktops. Application support (16%) and application development (20%) make up a large chunk of the operational cost.
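To put dollar figures on those percentages, here is the arithmetic written out. The percentages are the ones quoted above; the dollar conversion is mine:

```python
# Dollar breakdown implied by the percentages above for a $1B-revenue company
revenue = 1_000_000_000
it_spend = 0.05 * revenue            # ~5% of revenue -> about $50M
buckets = {
    "data center": 0.22, "help desk": 0.05, "desktops": 0.13,
    "app support": 0.16, "app development": 0.20,
}
dollars = {name: share * it_spend for name, share in buckets.items()}
# e.g. dollars["data center"] is about $11M; the remaining ~24% covers
# everything else (network, licenses, overhead, ...)
```

Seeing the buckets in dollars makes the later point obvious: the data center and the two application buckets are where consolidation and process change actually move the needle.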

Info Tech looked at how things can be changed. Travel and training were the low-hanging fruit. Changes in staffing and consultants/contractors are easy to make but negatively impact revenue. Outsourcing and renegotiating contracts take longer and have a significant impact on cost. Data center consolidation and process change also take longer but have less of an impact on cost.

One of the recommended ways of reducing cost is virtualization and consolidation. In the past, people purchased one server per app and sized it to meet peak load. If you purchase smaller systems and cluster them, you can scale performance as needed and use the idle resources for other things during non-peak times. The management issue at that point becomes peak management and demand management. It does increase administration requirements, but it significantly reduces capital spend and hardware support cost. This is an easy analysis to do for larger companies that use relatively large servers with 8 or more processors in a single system. An example of this is Oracle Education. We use RAC systems on the back end for a variety of classes and Oracle VM to provision prepopulated classes. We had been using snapshots copied from system to system, with dedicated database servers for classes. When we went to RAC and Oracle VM, we reduced the hardware by a factor of 6 and increased CPU utilization from 7% to 73%. Revenue per server increased 5x, which means that each class becomes more profitable without substantive changes in the way that we do business.
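The peak-versus-pooled sizing argument can be sketched with hypothetical numbers. None of these counts come from the Oracle Education example above; only the idea does:

```python
# Hypothetical sizing exercise: why per-app peak sizing wastes hardware.
apps = 10
peak_cpus, avg_cpus = 8, 1                 # each app peaks at 8 CPUs, averages 1

dedicated = apps * peak_cpus               # 80 CPUs if every app gets its own box
# In a shared cluster, peaks rarely coincide; assume at most two apps peak at once.
pooled = apps * avg_cpus + 2 * (peak_cpus - avg_cpus)    # 24 CPUs

util_dedicated = apps * avg_cpus / dedicated    # 12.5% average utilization
util_pooled = apps * avg_cpus / pooled          # ~42% average utilization
```

The "at most two apps peak at once" assumption is the whole game, which is why I call the resulting management problem peak management and demand management: get it wrong and you either overbuy again or miss your peaks.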

what does it take

I was fortunate enough to go to the third playoff game of the Rockets vs Lakers series on Friday, and a question came up after the game. What does it take to make an event fun and exciting? There were some people who came to the game and sat there stone-faced, not seeming to enjoy it. I personally had a great time because I don’t get to go to events like this much. Given that we had free food, drinks, parking, and the game, what else would it take to make someone enjoy the evening? I realize that this was a “vendor sponsored event” and that has implications and ramifications, but we didn’t talk business much and talked more about families and the game.

The key reason that I ask this is because it has implications on everything else. It all comes down to expectations and delivery. If you expect to have a bad time, there is nothing that anyone can do to make it fun. If you come in with an open mind, anything is possible. With Cub Scouts and Boy Scouts, many of the kids feel like it will and should be fun. The same is true for birthdays, Mother’s Day, Valentine’s Day, etc. When talking to someone trying to buy something, if the expectation is that they will pay too much or that the product is too expensive, no one will be happy.

I keep thinking back to a golf outing that we had months ago. We had a big group and got paired with some technology guys. When we got to the first tee, I clearly stated that I was glad not to be in the office and was looking forward to a good day of golf. If they wanted to talk business, I would freely talk about things. If they just wanted to enjoy the day, we would relax and not worry about talking technology. It turns out that we talked more about business because they relaxed and really had some legitimate questions.

it has been years since I have used StarOffice

When I worked at Sun, it was a constant struggle to use StarOffice because not all of the Microsoft Office features were implemented in StarOffice. Given that Sun would not pay for a Microsoft license, it was either use it or figure out another way. Well, five years and two versions of each product later, it is nice to see that nothing has changed.

I downloaded the current version, StarOffice 9, and it is nice to know that things remain the same. The word processing and draw products are good and reliable. The calc product still falls short, in my opinion. I have been using Crystal Ball for some simulations and decision tree analysis. I have found that the tool is very powerful and good at looking at project funding and financial analysis of purchasing goods and services. Unfortunately, it does not work inside of StarCalc.

The same is true for Visio. The StarDraw product cannot read or write the vsd format. Given that most of the stuff that I work on is proposals for IT departments, it is important to do network diagrams, server room diagrams, and connectivity or data flow models. Much as I don’t want to admit it, Visio is a de facto standard. Not being able to read this format is a show stopper in my opinion.

I have also been playing with the Oracle digital rights software, the IRM server. Unfortunately, the ability to seal a document appears in the Excel interface but not the StarOffice interface. The document must be sealed outside the tool which means that it needs to be stored unsealed and then sealed. If I am going to pay for a suite like this, I want the software to seal and unseal documents inside my commonly used tool.

The final problem has always been storing the files in a different format. It is problematic to ship a StarOffice-formatted file to someone who has Microsoft Office. You can open a document from Office in StarOffice but not the other way around. Until StarOffice becomes the default platform and storage format, it will continue to be difficult to use.

Hopefully, Oracle can solve this problem and integrate some of the back office calculations and procedures into the tool, making it the front end customization piece rather than having to do something like Oracle Forms or WebCenter. It would be nice if a tool like Excel/StarCalc could be the navigation tool into EBS or PeopleSoft and have Hyperion and Crystal Ball reporting all nicely integrated, while being able to save into the secure digital protection offered by an IRM server.

Given that I work for Oracle, these are merely my own opinions and not the direction of the company. I am not at a high enough pay grade to even have an opinion or influence in matters this important. I can only wish……

Oracle acquiring Sun

This morning’s announcement that Oracle will acquire Sun Microsystems took me by surprise. I worked for Sun for 13 years and have been at Oracle for almost 3. These are my notes from the briefing call that happened this morning.

This was a decision made at the highest levels. Larry Ellison and Scott McNealy were both on the call. The call started with Safra Catz talking about the financial analysis. This deal is larger than Hyperion but smaller than BEA and PeopleSoft. The deal is expected to have a higher margin and add $1.5B in non-GAAP income. The income will be 15 cents per share after the first year. This is more profitable than the other acquisitions. Oracle has the experience of doing this, and we will combine the software apps quickly. The combined company will focus on joint customers and help reduce cost and complexity. Sun currently outsources all manufacturing, and post-merger Oracle will manage this to maintain profitability.

Larry Ellison took over and talked about the history and success of past acquisitions. He focused on Solaris and Java as the key technical reasons. Java is the best known brand and is one of the most widely used platforms. Our middleware technology is based on this foundation, and moving forward we can customize both to make them work even better together. Sun’s largest business is the SPARC server component. Solaris is the best Unix technology available in the market. The Oracle database is deployed on Solaris more than on any other platform. The next platform is the Linux operating system. With the acquisition of Solaris, Oracle can optimize the leading operating system as well as the leading database to reduce complexity and improve performance. Both of these products are based on open technology. The combination of these two products will allow a growth of footprint and make it easier for customers to deploy complete and integrated hardware and software systems. Tuning Sun systems for Oracle will benefit existing customers.

Scott McNealy followed this with the history of Sun and Oracle and the common innovations that have been done in the past. Both companies are key innovators, and this innovation will continue moving forward. The joint company will allow companies to focus on what is important to them: running their business, not running their data centers.

Jonathan Schwartz spoke about the ability to address a broader array of platforms and a larger number of technology problems. Oracle can now drive a phase change in the industry while remaining based in open technologies.

Charles Phillips talked about the CIO advisory board from a few weeks ago; the consensus was that our initial foray into the Exadata/hardware solution was good and that we should do more. Tighter integration of hardware and software reduces the complexity and support issues with business solutions. Ready-to-deploy servers targeted at industries are one possibility. Sun’s open storage platform is very similar to the Exadata solution. Storage is a continuing and expanding market for years to come. There are exciting new opportunities with Java. We will have the largest software development community in the world.

Overall the conference call was simple, to the point, and stuck to the facts. There was no Q&A after the call, only the announcement. My personal opinion is that this will generate a significant amount of discussion. There are some overlaps of products in the identity and Linux areas. It is interesting that Oracle now owns SleepyCat/BerkeleyDB as well as MySQL. It is also interesting that Oracle now owns both JRockit and Java. My gut feel is that the GlassFish technology will come into play for the business users and will introduce new innovation into the Fusion suite of products moving forward. This will complicate things with Microsoft moving forward, as well as with a few other software partners. For me personally, I have been talking with a bunch of friends that I haven’t talked to in months.

five years from now…..

Some days I wish I had a crystal ball and could predict what will happen five years from now. If I could look into the future, I would know what to do today. Unfortunately, five years out, a few things are a given but a few things are still hazy. For example, I know that my 2002 VW Jetta will be gasping its last breath and my daughter will need to replace it during her freshman year in college. I know that my wife’s three-year-old car will be eight years old and probably on its second clutch by then. I’m not really sure what my next new car will be. Electric? Hybrid? Another diesel? Who knows. That decision isn’t due until December of this year.

Right now I am looking at getting a new home computer at the end of the summer. Since my daughter is starting high school, should I get her a laptop? Her new school provides computers, so why get her one? Should I invest in a cloud computer running Windows and let her remotely log into it? The cost of an Amazon Windows system is $100/month, which comes out to $1200/year. I can buy a pretty nice computer for that much, considering I get a new computer every three years. That says that I could spend up to $3600 on a new computer today and still save money. Sigh, if I could only convince my wife that the $2700 Mac laptop is a cheap solution, I would have one now. The big drawback to a hosted Windows system is that things like USB drives and audio don’t really work well remotely. Considering the integration of iPod devices into this environment, I guess we will need to stick with a physical device at home.
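The rent-versus-buy arithmetic in that paragraph, written out with the prices quoted above:

```python
# Rent-versus-buy arithmetic from the paragraph above.
hosted_per_month = 100               # quoted price of an Amazon Windows system
replacement_years = 3                # how often I replace a home computer
hosted_total = hosted_per_month * 12 * replacement_years   # $3600 over the cycle
# Any physical machine under $3600 wins on raw cost over three years --
# ignoring electricity, bandwidth, and the USB/audio limits of remote use.
mac_laptop = 2700
savings_vs_hosted = hosted_total - mac_laptop              # $900
```

Which is how a $2700 laptop starts to look like the frugal option.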

Speaking of iPods and iPhones (I commented on this in my personal blog), it seems like the iPhone is becoming more and more a part of my life. We just released an ldap application that allows me to look up the office phone and cell phone of anyone in the company. It also gives me a photo of the person. Unfortunately, it does not allow me to save this into my contact list locally. Next version, I guess. I wish I could predict what my phone will look like in five years. It would make things much easier. Will it be just a slimmer version of the iPhone? Will it be something like the Shuffle and be voice activated? Will it be more like the Kindle, a hybrid between a tablet, a book reader, and a cell phone? My dream device would be a color Kindle with 1TB of storage that recognizes voice commands, does Bluetooth stereo, lets me connect with a VPN client, plays my iTunes songs, and connects to my virtual server instances in the cloud. My gut tells me that we are a couple of years away from something like this, not five years. Oh yeah, did I mention that I want it to integrate with my new car so that I get a voice-activated heads-up display on the dash or windshield?

It is fun looking into the future and dreaming. Unfortunately, not many people are doing this these days. Many are looking over their shoulder and trying to keep a low profile. My recommendation is that now is the time to be bold. Do something different and something that will get you noticed. Play with some new software. Start a new skunk-works project that might make a difference some day. I personally have been playing with the Amazon Cloud because for very little money I can install some of our technology and see how well it works for me and for some of our customers. I am using it to teach some seminars and to do some proof of concepts. Some of my work has gotten noticed.

Five years from now what are your dreams? Staying at the same job and being secure? Dealing with the change that others force upon you? Are you leading the charge or are you the Cobol programmer that keeps the old stuff running? Personally, I want to be the change agent. I drive a Diesel VW because it gets better gas mileage than a Prius. I blog and have created my own wiki pages internal to Oracle to encourage people to contribute. I play with new technology to see if it enhances my life or makes it worse. I can honestly say that the iPhone has improved my life. I just need to stop using it at the dinner table and talk with my family.

The Amazon Kindle as a Tech Device

I have always been intrigued by electronic books. I have moved away from a bookshelf of technical manuals and gone to a disk full of pdf files that do the same thing. I was a long-term Safari subscriber back when it was only O’Reilly books. As a developer, it was invaluable. The search capabilities were just what was needed to get started and find a solution to a specific problem. The ability to download a chapter was perfect. It allowed me to take some reference material with me when I was doing consulting work or traveling. With the announcement of the Amazon Kindle 2, I thought it might be worth researching. Fortunately, I know someone who does not mind too much if I borrow her Kindle, as long as I give it back when I get home from work. I’m glad my wife is very understanding.

Some of the features of the Kindle that I like are:

1) the ability to get pdf docs onto the device for display and searching. I can download manuals and tutorials on software. It becomes a portable library. I can download hardware and software documents that go into more detail than I care about. Getting the pdf docs there is a little tricky. I wish you could just download a document and copy it over via the USB connection, but it needs to be converted first, and Amazon has an automated way of doing this through email. If you email the document to your Kindle address, it converts the document and puts it on your home page.

2) reading blogs from the device is very easy. You can define search patterns and blog links using Google Reader and save this as a bookmark. The Kindle has a browser that allows you to pull in web content. The browser is not full function and does have some limitations. It does not do Flash sites, does not support ActiveX, and has limited support for JavaScript. I tried posting this blog from the Kindle and could not because the login button would not work. It handled the username and password as expected but would not follow the Go button to pass these values. It also caches passwords and does not prompt for a password if you tell it to remember you on the device. This is a good and a bad thing, because the Kindle itself is not necessarily a secure device. I’m sure the corporate security guys would ban using it inside or outside the firewall to connect to corporate resources. I haven’t figured out how to get VPN working on the device, so connecting to corporate web sites hasn’t been an issue anyway.

3) downloading training audio is supported, but the mp3 player on the device operates like an old iPod Shuffle. There is no user interface to select what you are listening to and no way to rewind and listen to something that you missed. It is a neat feature, and hopefully the v2 release will provide a richer user interface for audio.

4) I did a simple search in the Kindle store, and there are a large number of Oracle books available: 98 using just the word Oracle. All but one on the first page were related to Oracle products. If I follow the related search for “oracle 11g”, it returns 12 results. Following “oracle database” returns 59 results. Unfortunately, the $9.99 price tag for most hardcover books does not translate to technical books. The OCA Oracle Database 11g: Administration I Exam Guide, for example, is $38.87. This is understandable, but I was hoping for a cheaper way of getting the book.

Overall, I want one. I like the idea of having a portable reference library in just under 11 ounces. Like the iPod, the cost isn’t in the device; the value is in the content and the ability to access information. I can see spending twice what the device costs for content. I can also see my wife wanting her Kindle back tonight, so I’d better not get too attached to it.

creating an IRM service in the Amazon Cloud

Ok, I am a glutton for punishment. We don’t have any hardware to play with in our office. It turns out that very few people have extra hardware lying around. I guess I got spoiled working at a university for as long as I did. We always had something from the 90’s lying around that we could use to play with, and when virtual machines became trendy, it was easy to get a VM to prototype anything that you wanted. Now that I have been at Oracle for a while, I find it a little difficult to find a VM to play with and test software. Thus my current project: how do you get an IRM server running in the Amazon Cloud and get all of the services running properly? I did run into a few challenges; the following describes many of these hurdles.

1) the software only runs on Windows. I am a Unix admin, so I felt lost initially. Some of the things that I stumbled over are trivial issues for Windows admins. I had to start a Windows Server 2003 instance and get it running with remote desktop. Given that this is different from VNC, it took a little learning to figure out what ports need to be opened and how to connect. Needless to say, it would not work through the corporate firewall, and I reluctantly had to spend hours and hours working from home instead of commuting to the office.

2) I had to remove components from Windows to get things working properly. IRM does not install properly if .NET is installed. The software itself works fine with .NET; it just has problems installing. I had to look at Metalink and the forums to find this out. This was easy because I know how to remove components.

3) I had to add components to Windows. The IIS service with SMTP needs to be configured and running for the management interface to operate properly. I would have preferred running Apache, but that wasn’t an option, so I had to figure out how to first install and then configure IIS. The installation was a little difficult because, running on a virtual service, it is hard to mount a DVD and point my instance at the software. Fortunately, Amazon thought of this and provides a snapshot of the DVD with snapshot ID snap-8010f6e9. This is the DataCenter Edition of Windows Server 2003. All I had to do was select IIS and the SMTP service and begin the install. It did ask me to find a couple of files, but that was relatively trivial. Once I got the IIS service installed, I had to configure it to send email from a drop directory. At this point I went on the web and found instructions for using my gmail account as a relay, since Oracle only allows email to be initiated from within the firewall.

4) Once I had the web service up and running, I had to install a database. For this installation I used the Oracle Database Express Edition. This is adequate for a test system, but I might want to install Standard Edition or Enterprise Edition to make it part of my backup and failover strategy. This was relatively trivial, but it did require that I install the client-side software to allow for ODBC connections. I could not get an ODBC connection with a default install of XE.

5) Once I had the web service and database up and running, I had to install the IRM server. This was relatively trivial as well. I did need to remember where I installed the IIS service and the pickup location for email, as well as the ODBC connector for XE. Given that this was my first install, this is where everything went wrong. I could not get email to work because I forgot where I defined the pickup directory in IRM and IIS. I again had to search the web and look for ways of testing email drop directories. I got this working by simply creating a file in the drop directory, which in my case was C:\Inetpub\mailroot\Pickup (tough one, I know), and watching to see if the file disappeared and showed up in my email. Fortunately, it did, because I had configured the IIS server properly using the IIS Manager that gets installed with IIS. The next problem was where IRM defines the drop directory. I looked everywhere and had to ask the expert (Simon Thorpe) how to find it. It turns out that there is a file, c:\Inetpub\wwwroot\SealedMedia Management\smpweb_config.txt, that defines the drop directory. I would never have looked there. I installed the software in e:\Program Files\Oracle\Information Rights Management\IRMServer. I guess this is the difference between a program written for Unix and a program written to work with IIS. I would have created a symbolic link from the install directory into the IIS root and made it simple. Once I got this working, I was able to log in and create an account. The account creation sent a confirmation email to the user (which I could verify) and everything looks to be working.
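The drop-directory smoke test can be scripted instead of eyeballed. Here is a minimal sketch; the Pickup path in the usage comment is from my install, so adjust it to yours. All it does is watch for the SMTP service to collect the test file:

```python
import os
import time

def pickup_cleared(path, timeout=30.0, interval=1.0):
    """Return True once the SMTP service has collected the file at `path`
    from the pickup (drop) directory, i.e. the file disappears within
    `timeout` seconds; return False if it is still sitting there."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if not os.path.exists(path):
            return True          # SMTP picked it up
        time.sleep(interval)
    return False                 # still there -> misconfigured drop directory

# Usage on the server: drop a minimal .eml file into the pickup directory
# and wait for it to vanish (path from my install; yours may differ):
#   ok = pickup_cleared(r"C:\Inetpub\mailroot\Pickup\test.eml")
```

If the function returns False, the SMTP service is not watching the directory you think it is, which is exactly the failure mode I hit above.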

Now that I have the service running, I can let some local customers play with it. The management console and web console are good interfaces that allow me to define contexts and have them play with managing, sealing, and notifying people of changes. I don’t need to give them access to the operating system because the service is self contained. Overall this was a good learning experience.

Windows on Amazon EC2 – my experiences

When I first started playing with Windows on Amazon, two things caught my eye. First, it operates differently from Linux. Second, configuration was just as easy as Linux, and it looks like a nice environment to play around in.

Before I started playing, I took a look at my Amazon web service account and figured out that I had either configured something wrong or someone was using my account to run their own service. My bill in November was $42, and in December it increased to $82. I thought that this was a little unusual because I didn’t use the service at all in December. I logged a service request and, 12 hours later, hadn’t heard anything back. To solve the problem, I deleted my S3 stored services and changed my encryption keys and auth keys to get into the system. Once I did this, the activity stopped. I don’t know if I killed a rogue process or if someone using my service got terminated.

The first thing that I had to do was try to launch everything and avoid doing any research. Well, this got the expected results. I had some instances launch, but I could not get to them. With Linux, I used ssh to connect to the instance and open a VNC session to display the console back to my desktop. I didn’t think that Windows would do that, but I thought that I would give it a try. Needless to say, the console connection failed. It was interesting that it did attempt to connect using remote desktop but failed on the connection. At this point, I decided to do some research. I found another blog that described exactly what I was looking for: you have to add port 3389 to the security profile and start the service.

I did this and relaunched the instance. It turns out that it worked. Unfortunately, it asks for a password. Since I was using Elasticfox, I was able to get the admin password by right clicking on the instance and having it copied into my clipboard. I was then able to type the password into the login, change the admin password on the Windows 2003 instance, and do things like create accounts, surf the web, and shut down the instance. It worked as expected.

It looks like the charge for running a vanilla Windows instance will come out to be $90/month. This is a little more expensive than the $70/month that Linux comes in at but better than having to get a Windows license and install it on a box in my house or office.

Next up, I get to try the S3 storage service with Windows and with Linux. I wonder if there is a way to have one storage instance that can be mounted between both operating systems.

do I really need Veritas Cluster software or Sun Cluster software

A common question these days comes up when we start talking about replacing older multi-processor systems with newer systems. Many people want to look at getting rid of their cluster software, either Veritas or Sun cluster software, because the maintenance prices are relatively high. The question comes up: do I need to purchase RAC to use the Oracle Clusterware? In short, the answer is no. You do not need to purchase RAC software to use Oracle Clusterware. It is a free software package that you can download for most operating systems.

The next question that comes up is how it compares to the Veritas and Sun suites. I typically refer everyone asking this question to:
– a whitepaper
– an example of how to protect non-Oracle apps
– an article on how to protect an application server
– sample code to keep an http server up and running
– and a support forum

The net of all these references is that you can create a farm-like instance of an application and run it across multiple computers. These computers run an instance of the application and restart the application if it halts or fails on one computer. It is important to note that this does not maintain the state of the application, but it does make sure that the application runs on at least one of the servers in the farm. If, for example, you want to make sure that you have a web server running that is presenting static information, you can attach multiple computers to an nfs-mounted file system or network shared disk and run the httpd on one of the computers in the http farm. If the httpd fails, it is restarted on the same or a different computer. If this web server hosts dynamic information like a shopping cart, the application needs to make sure that state is stored in the database repository and not in the memory of the web server, or the shopping cart will be lost. If you have an application that keeps state in memory, you will need something like Coherence to share memory between instances of an application server. Some prime examples of services that you typically want to keep up and running are an ldap server, a database listener, and a dns server. Any service that connects to a repository to look up data and report the results back to the client is a good candidate.
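The restart-on-failure behavior described above can be illustrated in a few lines. To be clear, this is not Clusterware code; it is a single-node caricature of the check/restart cycle, whereas real Clusterware coordinates this across nodes with VIPs and shared state:

```python
import subprocess
import time

def supervise(cmd, checks=5, interval=1.0):
    """Start `cmd`; on each periodic check, restart it if it has exited.
    Returns the number of restarts performed. This is the single-node
    essence of what cluster software does for a stateless service."""
    restarts = 0
    proc = subprocess.Popen(cmd)
    for _ in range(checks):
        time.sleep(interval)
        if proc.poll() is not None:      # process died -> bring it back
            proc = subprocess.Popen(cmd)
            restarts += 1
    if proc.poll() is None:              # clean up the survivor
        proc.terminate()
        proc.wait()
    return restarts
```

Note what is missing: nothing here preserves in-memory state across restarts, which is exactly why a shopping cart must live in the database and not in the web server’s memory.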

What is the key difference between a server farm and a cluster? A server farm is a group of servers that respond to requests in a round-robin or load-balanced configuration; this typically requires an expensive router to manage the service. Clusterware is a software management system that performs the same function using the operating system and the resources available to it to provide high availability for a service.

The key sub-components that make this software work are virtual IP addresses, cluster ready services, and cluster synchronization services. A virtual IP address (VIP) is needed so that applications connect to a single IP address and not to a physical machine in the cluster. The cluster software manages which physical machine answers the virtual address. The cluster ready services component launches and monitors applications on the various nodes to make sure that requests are being processed and answered. The cluster synchronization services component manages shared resources like a disk or a network resource and acts as an arbitrator and broker if any node gets confused.

In general, the cluster software allows loosely coupled systems to act as a tightly coupled system for a given application. The three components run on all nodes and communicate with each other to manage the application running on one or multiple nodes. This configuration can respond to load balancing requests or just make sure that at least one node is available to respond to requests. Given that it is a free software package, it is worth looking at as a general purpose tool instead of paying for services from Sun or Veritas.

The cost of business intelligence (or not)

I have sat in on a couple of meetings with financial departments and listened to the need for reporting. This seems to be the topic of the year. The struggle is how to do this and how to fix what has been done in the past.

First, let’s look at what has been done in the past. The key reporting tool historically has been something like Discoverer. Everyone I talk to has a love/hate relationship with this tool. They love the information that comes out of it. They hate it because it makes the concurrent manager more complex to run and drives up hardware and license costs by increasing the horsepower needed to run your general ledger. The key problem with this solution is that it does not scale: the more reports you run, the more overloaded your system becomes. The more you use it, the more difficult it becomes to keep up, and closing your general ledger takes longer and longer.

One company that tried to reduce its close time quickly realized that it had written custom interfaces that pulled data out of E-Biz, put the information into a spreadsheet, performed a calculation, and manually entered the results back into E-Biz. When they initiated the project, they quickly realized that it wasn’t the process that was broken; it was the data that was invalid. Over the years, errors accumulated, got bigger and bigger, and eventually hit the bottom line. One division that always appeared to be profitable turned out to be losing money on a regular basis. An error in a calculation that added two cells instead of subtracting them showed a profit and took money from another department.

If you take this methodology to the extreme, you get people who do nothing but come into work, look at the nightly runs of data extraction, verify that the results look good, and manually key the information into E-Biz to reduce the close time. You get hundreds of people monitoring thousands of calculations and an IT department that does nothing but handle failed jobs and bad data pulled from a variety of sources.

Let’s look at the polar opposite of this. Consider someone who created their own interfaces to capture data as it is entered and then push it into E-Biz. The reason to capture the data is to reduce the Discoverer runs and concurrent jobs. The data is pushed into a spreadsheet as it is entered into E-Biz, and updates run nightly to verify that all of the data was properly collected. This solves the processing-power issue in E-Biz but does not address the bigger picture.

Both of these solutions hamstring organizations because they tie custom code into either the forms entry or the spreadsheet calculations. When it comes time to upgrade the E-Biz package or install a patch, testing becomes a nightmare. If a new feature is needed, like handling a new currency or language, multiple lines of code need to be changed in different places. If not all of the places are changed, errors can creep into the system with no way to trace them.

The real solution to this problem is twofold. The first part is to have a common API to pull and push data through E-Biz. This can be done through SOA or web services. If a financial element is pushed into E-Biz, the currency type is pushed as well; the conversion is done in the web service and not in the spreadsheet. The second part is to have a data warehouse that is populated on a regular basis, separate from the E-Biz services. Rather than running Discoverer against E-Biz, you run OBIEE/Analytics against the data warehouse and do not need as large a database under E-Biz. This solution also allows you to pull data from alternate repositories that are not Oracle solutions. If you patch or upgrade E-Biz, you change the API interface and tell it how to populate the data appropriately. If you patch or upgrade your third-party application, you change the API interface to understand the data change. This lets you centralize your changes instead of changing things in multiple places.
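A minimal sketch of the first part of that solution, where one common entry point owns the currency conversion. The function names and conversion rates here are made up for illustration; a real implementation would call the E-Biz web service rather than write to a dictionary.

```python
# Illustrative rates only -- a real system would fetch these from a
# rate service, not hard-code them.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.08, "GBP": 1.27}

def normalize_amount(amount, currency):
    """Convert an incoming amount to the ledger's base currency (USD here),
    so the conversion lives in one place instead of every spreadsheet."""
    if currency not in RATES_TO_USD:
        raise ValueError("unsupported currency: " + currency)
    return round(amount * RATES_TO_USD[currency], 2)

def push_financial_element(ledger, account, amount, currency):
    """Hypothetical single entry point for writing to the ledger.
    Every caller goes through here, so a patch to the conversion
    logic is made once, not in dozens of spreadsheets."""
    ledger[account] = ledger.get(account, 0.0) + normalize_amount(amount, currency)
    return ledger[account]
```

The point of the sketch is the shape, not the arithmetic: when E-Biz is patched, only `push_financial_element` changes, and every spreadsheet and form that calls it picks up the fix for free.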

Politically, this is a hard solution to sell. Given that knowledge is power, giving up the calculation in your spreadsheet, or giving up the headcount that verifies a calculation for your department, means loss of budget and resources. A solution like this typically involves additional expense, significant consulting and manpower to implement, and loss of power for individuals. What it gives you is a single source of truth and control over who sees what and how much detail they can get. It also typically requires direction from executive management and an investment of resources to document processes and procedures. That can be a scary thing at times and expose stuff that some people want to keep hidden.

I realize that this is a little off topic from what I typically write about but I have heard the same message multiple times from different customers in different industries.