Beyond Excel – SharePoint

Analysts have been using spreadsheets for a long time; spreadsheets become more powerful with every upgrade, but our demands grow even faster. Excel 2013, another major upgrade, is around the corner; what is on your Excel wish list? Recently, I discussed with a group of financial analysts what they want to see in Excel 2013. They are power Excel users and want Excel to do more for them. It is not surprising that they put together a long list, but I am only going to summarize the most popular items.

  • Data analysis
  • Data visualization
  • Data storage / integrity
  • Data extraction
  • Security
  • Collaboration
  • Programming
  • Control

The Excel 2013 customer preview is currently available for download. As usual, it comes with a lot of enhancements. However, Excel 2013 does not address all our needs, and our offices may not upgrade to Excel 2013 for another few years. So in the meantime, what can we do to address our needs? How can we effectively improve our efficiency and work smarter? Well, your organization may already have a solution, i.e. SharePoint.

I asked my colleagues if they had ever used SharePoint; apparently, most of them have, but mainly for document storage. SharePoint is an excellent content management system, so it is a great idea to use it as an electronic library. However, it is more than just an expensive electronic library; it has many great features that help data engineers transform data into insights. Let me show you some useful SharePoint services.

Version Control


Figure 1 Version Control (from microsoft.com)

Version control allows us to build a version history and keep all previous versions with version information, e.g. when an Excel workbook was created or changed, and even comments describing the changes. It is handy to have access to all previous versions; we can view or compare previous versions, or even restore any previous version when we hit a dead end and want to drop all changes. SharePoint automatically maintains all versions, and there is always only one file to handle.


Excel Services

 


Figure 2 Excel Services (from microsoft.com)

Excel Services extends Excel from the desktop into server infrastructure. It manages security and does all calculations on the servers. Once we upload the workbook to SharePoint, anyone with the required permission can access the whole workbook, or pieces of it, from the browser. We still have to use Excel to create the workbook, but SharePoint allows us to distribute and share it in a more controlled and secure environment.

PerformancePoint Services

 
 

 

Figure 3 PerformancePoint Services (from microsoft.com)

PerformancePoint Services allows us to create dynamic, powerful and reusable dashboards; we can create sophisticated KPIs and scorecards with drill-down and drill-up capabilities. One of the handy analytical tools that I like to use is the Decomposition Tree; with it, we can easily visualize the root causes of a business problem.



 

Workflows

 
 


Figure 5 Workflows (from microsoft.com)

SharePoint provides a set of pre-configured workflows that automate common business processes in a production environment, e.g. collecting signatures, feedback, or approvals. We can also build customized workflows tailor-made for our specific business processes, e.g. workflows that automatically collect business planning workbooks from each business unit, approve them, and consolidate them into a single workbook. All these business processes can be done with minimal coding in SharePoint.

Conclusion

SharePoint has many powerful features. It allows us to work together on a secure, automated collaboration platform, and it grants us critical business insights via interactive dashboards and scorecards. There are many other SharePoint features that I haven't covered in this article, e.g. Access Services, Visio Services, PowerPivot for SharePoint, Power View, etc.

I often say Excel is a very handy Swiss army knife because it can handle many analytical tasks. SharePoint moves Excel to another level, an enterprise level that often requires tighter control, better collaboration and higher efficiency. Talk to your IT department; SharePoint may already be installed, and all you have to do is use it!

   

Beyond Excel – Microsoft Office Access?

One of Excel's many talents is data storage; we can store a lot of data in an Excel workbook. Excel 2010 64-bit can create a workbook with a size of up to 4 GB. However, Excel is not designed to be a database management system and has many limitations for handling enterprise-grade business applications, e.g. each worksheet can only store up to 1,048,576 rows, and it does not have robust data integrity or concurrent-user support. Therefore, many power Excel users are looking for better data management software, and Microsoft Office Access is often chosen.

Access is a powerful desktop database management system, and it is very suitable for desktop applications with a small number of users accessing them simultaneously. Microsoft Office Access is currently bundled with Microsoft Office Professional, or you can buy a standalone version for US$139. It has two main components: a database engine and a rapid application development (RAD) tool that can generate a business application with minimum coding. Microsoft Office Access is definitely a very user-friendly, powerful and flexible tool for small and medium businesses, and its license cost is very reasonable.

Despite all these benefits, I have been recommending Microsoft SQL Server to my clients, and a lot of people have asked me why I prefer SQL Server. One very important reason they hesitate is budget: they believe an enterprise-grade database management system (DBMS) like SQL Server is very expensive! Well, what if I tell you that it is free? Microsoft SQL Server has many editions to fit different needs, and its Express Edition is FREE; we can download it from Microsoft. Nothing beats free! Yes, it is a trimmed-down version, but it is still more powerful than Microsoft Office Access in many aspects, e.g. it includes 10 GB of storage per database while Access only supports 2 GB; it can utilize up to 4 cores while Access can only use a single core; and it provides much better performance and scalability for multi-user applications. Another key benefit is that SQL Server has multiple editions, so we have a seamless upgrade path when our business grows, e.g. Express Edition supports 1 CPU (up to 4 cores) but Standard Edition can take advantage of 4 CPUs (up to 16 cores). We don't have to worry about whether our DBMS can grow with our business.

Another advantage of SQL Server is its recent cloud deployment, i.e. SQL Azure. SQL Azure is a cloud database service provided and maintained by Microsoft. High availability, scalability and fault tolerance are all built in, and it has a 99.9% monthly SLA. We let the experts (who can be more expert than Microsoft when it comes to SQL Server?) manage our database platform. SQL Azure has no upfront cost; we don't need to buy any new hardware or software licenses. We just pay a monthly subscription fee based on our usage:

Database Size (price per database per month)

  • 0 to 100 MB: flat $4.995
  • Greater than 100 MB to 1 GB: flat $9.99
  • Greater than 1 GB to 10 GB: $9.99 for the first GB, plus $3.996 for each additional GB
  • Greater than 10 GB to 50 GB: $45.954 for the first 10 GB, plus $1.998 for each additional GB
  • Greater than 50 GB to 150 GB: $125.874 for the first 50 GB, plus $0.999 for each additional GB
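The tier arithmetic above is easy to sanity-check with a small function. This is a sketch only: the rates are the 2012-era ones quoted in the table and will change, and the 150 GB cap reflects the SQL Azure limit at the time.

```python
def sql_azure_monthly_price(size_gb):
    """Monthly price per database under the tiered rates quoted above.
    Illustration of the tier arithmetic, not a current price quote."""
    if size_gb <= 0.1:      # 0 to 100 MB
        return 4.995
    if size_gb <= 1:        # up to 1 GB
        return 9.99
    if size_gb <= 10:       # $9.99 first GB + $3.996 per extra GB
        return 9.99 + (size_gb - 1) * 3.996
    if size_gb <= 50:       # $45.954 first 10 GB + $1.998 per extra GB
        return 45.954 + (size_gb - 10) * 1.998
    if size_gb <= 150:      # $125.874 first 50 GB + $0.999 per extra GB
        return 125.874 + (size_gb - 50) * 0.999
    raise ValueError("SQL Azure databases were capped at 150 GB")

print(round(sql_azure_monthly_price(10), 3))   # -> 45.954
```

Note how each tier's flat amount is just the previous tier evaluated at its upper bound, e.g. $9.99 + 9 x $3.996 = $45.954.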

Data transfers measured in GB (transmissions to and from the Windows Azure datacenter)

All inbound data transfers, i.e. data going into the Windows Azure platform datacenters, are free. The price for outbound data transfers, i.e. data going out of the Windows Azure platform datacenters, is shown below:

  • North America and Europe regions: $0.12 per GB
  • Asia Pacific Region: $0.19 per GB

For as little as a McDonald's lunch per month, we can get an enterprise-grade DBMS platform and support. But if we think we can handle our own data backup and disaster recovery, we can stay with the free Express Edition, which supports up to 10 GB of storage. Everything sounds great, but let's ask one fundamental question: is it easy to work with SQL Server from Excel? After all, we still want to use Excel to do our analysis. I have a couple of blogs about importing data into Excel, "Import Data Into Excel – Data Connection" and "Import Data Into Excel – VBA". You will find that importing data into Excel from SQL Server is as easy as from Microsoft Office Access.

SQL Server can also help us to:

  • Simplify Development Efforts – friendly GUI, full integration with Visual Studio, and performance tuning
  • Develop Sophisticated Applications – apart from desktop applications, it is also ideal for supporting web applications and smartphone apps that require high scalability and reliability
  • Deliver Better and Faster Information – empower users to access the information they need through various advanced business analytics tools, e.g. Reporting Services, Analysis Services, Data Mining, etc.

Instead of answering why I prefer SQL Server, I always ask why we would want to use a desktop database platform when we can have an enterprise DBMS. Why do you like to use Microsoft Office Access?

How about the rapid application development (RAD) tool that Microsoft Office Access provides? It is also a great development tool, and I will discuss whether there are better alternatives in another blog.

PowerPivot – Personal BI on your desktop

What do you think about business intelligence (BI)? A lot of people believe BI is only for large enterprises because it is expensive, takes a long time to implement and requires specially trained consultants. The worst part is that it often does not deliver what it promises: real-time, self-service information delivery. Even when there is BI within the organization, many analysts still heavily rely on their most trusted software, Excel, to do their analysis because their corporate BI just does not provide everything they want, or they are not familiar with the new BI environments. We want a BI system that we can manage ourselves and that is ready to use tomorrow.

PowerPivot may be the solution that we have been looking for; it is a free Excel add-in from Microsoft, and if you know pivot tables then you can start using it right away. And it is much more powerful than plain Excel. We can download it from http://www.microsoft.com/en-us/download/details.aspx?id=29074. Once we install PowerPivot, we can download an example workbook from Microsoft, http://www.microsoft.com/en-us/download/details.aspx?id=28572. There are 7 tabs in the workbook. Let's examine the first one, SalesByCountry, where you will find a very simple pivot table.

You can also see a slicer that you can use to select sales channels:

And a PowerPivot Field List (it is very similar to the pivot table field list):

When I showed this workbook to my colleagues, they often said, "Hold on here! Is this just a regular Excel pivot table?" Well, this is the beauty of PowerPivot; it is an extension of the Excel pivot table. If you know pivot tables, then you can start using PowerPivot. The major difference is the data behind it. You can analyze much, much more data with PowerPivot. How much more? There are 2 fact tables and 10 dimension tables in this workbook. FactSales has 3.4 million records and FactInventory has just over 8 million records. MILLIONS! PowerPivot can handle millions of records; this is the beauty of PowerPivot. One of the new features in PowerPivot 2012 is Diagram View, which allows us to easily add and change relationships between tables. I like this new feature because BI often involves hundreds or even thousands of tables; it is great to visualize them.

You can see the relationship between the tables if you click on the arrow line.

 

i.e. FactSales and DimStore are connected by StoreKey. Let's go back to our first PivotTable; it is a very simple PivotTable with only 4 elements.

  • Slicers Vertical – ChannelName (DimChannel)
  • Column Labels – CalendarYear (DimDate)
  • Row Labels – RegionCountryName (DimGeography)
  • Values – Sales (FactSales)

If you right-click on the first 3 elements, you will see their Field Settings:

 

 

The last one, Sales, has its own Edit Measure dialog because it is a measure field:

 

 

It has a very simple formula: =SUM(FactSales[SalesAmount]), i.e. sum the SalesAmount column from FactSales. There is a fundamental difference between PowerPivot and Excel: the basic element in PowerPivot is the column, so all formulas in PowerPivot operate on columns rather than cells. Let's review the data behind this report by clicking the PowerPivot Window button.

 

 

PowerPivot for Excel will be displayed, and we can see all the data behind these reports. DimDate has 2,556 records and many data columns that describe the date information.

 

 

FactSales has over 3.4 million records; it has many data columns and other important information. Have you noticed that the first half of the screen has all the data columns, but there is something else at the bottom?

  • Sales
  • StoreSales
  • PctAllRegions
  • YOYGrowth
  • And more….

What are these rows?

 

 

If we go back to our first report, SalesByCountry, and look at the PowerPivot Field List under FactSales, you will notice some of the fields have a calculator icon next to them. These are the measure fields; each has its own formula.

 

  • Sales:=SUM(FactSales[SalesAmount])
  • StoreSales:=CALCULATE([Sales], DimChannel[ChannelName]="Store")
  • PctAllRegions:=[Sales]/CALCULATE([Sales], ALL(DimGeography))
  • YOYGrowth:=[Sales] - [Sales](PREVIOUSYEAR(DimDate[Datekey]))
  • PrevYearSales:=IF( COUNTROWS(VALUES(DimDate[CalendarYear]))=1, [Sales](PARALLELPERIOD(DimDate[Datekey],-12,MONTH)), BLANK() )

These are not Excel formulas; they are something new that we need to learn if we want to maximize PowerPivot's potential. They are written in the Data Analysis Expressions (DAX) language, which I will discuss in the near future. But even without DAX, PowerPivot allows us to handle analysis on data volumes that were impossible in Excel before, and it is much faster! If you would like to find out more about PowerPivot, please stay tuned; I am going to discuss more about DAX and how to import data into PowerPivot. PowerPivot offers us the opportunity to have our own personal BI on our desktop. It has been available since the release of Excel 2010, and recently Microsoft released a new version, PowerPivot 2012, which includes many great new features that will make our analysis better and faster!
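To make the column-oriented idea concrete, here is a small Python sketch of what the first two measures compute. The rows are made up for illustration (not the workbook's 3.4 million records), and the point is only the shape of the calculation: a sum over a whole column, and the same sum re-evaluated under a filter.

```python
# Toy stand-in for FactSales joined to DimChannel (hypothetical rows)
fact_sales = [
    {"SalesAmount": 100.0, "ChannelName": "Store"},
    {"SalesAmount": 250.0, "ChannelName": "Online"},
    {"SalesAmount": 175.0, "ChannelName": "Store"},
]

# Sales := SUM(FactSales[SalesAmount]) -- operates on the whole column
sales = sum(row["SalesAmount"] for row in fact_sales)

# StoreSales := CALCULATE([Sales], DimChannel[ChannelName]="Store")
# -- the same measure, re-evaluated under a channel filter
store_sales = sum(row["SalesAmount"] for row in fact_sales
                  if row["ChannelName"] == "Store")

print(sales, store_sales)  # -> 525.0 275.0
```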

Beyond Excel – MATLAB

Many financial analysts develop models using Excel; it is easy, flexible and powerful. However, as time goes by, some of these models become sophisticated, unstable or take forever to run. These models have reached Excel’s limits and it may be time to migrate the models to a more flexible and powerful tool, i.e. MATLAB.

What is MATLAB?

Just like Excel, MATLAB has its own financial functions, development environment, and high-level programming language, but you will soon find out MATLAB is much more advanced; it allows us to develop more challenging financial models in a more effective manner.

Financial Libraries

Excel 2010 only has about 50 simple financial functions but MATLAB provides a huge set of financial libraries:

  • Financial Toolbox – estimate risk, analyze interest rate levels, price equity and interest rate derivatives, and measure investment performance
  • Financial Derivatives Toolbox – analyze and model equity and fixed-income derivatives and securities contingent on interest rates
  • Fixed-Income Toolbox – calculate price, rates, and sensitivities for interest rate swaps, credit default swaps, bond futures, and convertible bonds
  • Econometrics Toolbox – Monte Carlo methods for simulating systems of linear and nonlinear stochastic differential equations

These MATLAB toolboxes can handle very demanding financial models, e.g. asset allocation, cash flow analysis, portfolio optimization, credit default swap valuation, or using the GARCH, Heath-Jarrow-Morton, Black-Derman-Toy, or Black-Karasinski models to price assets or measure interest rate sensitivity. This is just the tip of the iceberg.

However, the most important feature is that MATLAB gives us the source code, so we can customize it according to our own requirements. Instead of building financial models from scratch, we can jump-start them using existing MATLAB functions. How quickly can we build a model in Excel to calculate option-adjusted spread? Days? Weeks?

High Performance Computing (HPC)

Have you tried to run stochastic models in Excel? It can take days, weeks or even months to process; this is why Microsoft came up with HPC Services for Excel. However, it requires expensive server hardware / software and special cluster-safe functions.

MATLAB's answer to HPC is much simpler and more powerful. It has two different toolboxes to handle the most demanding models:

We don't need expensive server infrastructure; I managed to increase my model's performance by nearly 350% on my Dell quad-core desktop, and by over 6 times with a $300 video card. If I need significantly more power, I can upload my model to Amazon Elastic Compute Cloud (Amazon EC2) – Cluster GPU. It is much easier to scale up performance in MATLAB.

And even on a single core, MATLAB can process much faster than VBA because MATLAB supports native matrix manipulation.

And more…

Apart from financial libraries, MATLAB also has other advanced mathematical and statistical packages. It supports object-oriented programming, and we can use its profiler to identify where the bottlenecks are. We can deploy our models to other environments, e.g. Java, .NET and even Excel. It is also very easy to use.

The bottom line is: if your financial models have reached Excel's limits and you are looking for something beyond Excel, you will find MATLAB a very good alternative!

Trial and Error? – Cash flow matching

When I was a kid, I spent a whole summer trying to solve magic squares. The first one (3 x 3) was easy; it only took me a couple of hours.

clip_image001

The next one (4 x 4) was much harder; it took me a whole week to solve it.

clip_image002

I was never able to solve the 5 x 5 one that whole summer, until I learned the trick.

clip_image003

I guess you know what I did the whole summer; I just kept doing trial and error!
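The article never names the trick, but one well-known construction for odd orders is the Siamese method. Here is a sketch of it in Python; treat it as one possible trick, not necessarily the one I learned that summer.

```python
def siamese_magic_square(n):
    """Build an n x n magic square for odd n using the classic
    Siamese method: start mid-top, keep moving up-right (wrapping),
    and drop down one cell whenever the target is occupied."""
    if n % 2 == 0:
        raise ValueError("the Siamese method needs an odd order")
    square = [[0] * n for _ in range(n)]
    row, col = 0, n // 2            # start in the middle of the top row
    for value in range(1, n * n + 1):
        square[row][col] = value
        r, c = (row - 1) % n, (col + 1) % n   # up-right, with wrap-around
        if square[r][c]:                      # occupied: drop down one
            r, c = (row + 1) % n, col
        row, col = r, c
    return square

for line in siamese_magic_square(5):
    print(line)
```

Every row, column and diagonal of the result sums to n(n^2 + 1)/2, which is 65 for the 5 x 5 square.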

I still often solve complex problems that involve many conflicting interactions by trial and error, e.g., business planning or resource allocation. However, the trial-and-error approach can be time consuming, and we often cannot guarantee that we have reached an optimized solution. Fortunately, there is a lot of commercial optimization software available.

Today, I would like to demonstrate how to use Excel Solver, a free optimization add-in that virtually every analyst has when Excel is installed on their PC.

I am going to solve a very simple financial problem – cash flow matching. Let's say we have a series of fixed cash flow payments that we are committed to pay for the next 25 years. To cover the liability stream, we have a pool of 200 qualified assets that we can invest in to generate the target cash flow. You may wonder why 200; I will explain that later on.

clip_image005

I spent quite a bit of time trying, by trial and error, to find an optimized solution, i.e., one where the projected cash flow is always greater than the target cash flow. However, there are just too many assets (200), and each of them has its own projected cash flow, maturity date, coupon payment, expenses, etc. It is mission impossible to identify such a solution by trial and error.

Below is one of the scenarios that I tried. You can see how difficult it is to calibrate the model.

clip_image007

Let's see what Excel Solver can do. The objective is to minimize market value, subject to the constraint that the projected cash flow must always be greater than the target cash flow.

In just over a minute, Excel Solver identified an optimized portfolio that satisfied the above objective and constraints. You can see from the following graph that the projected cash flow in most years is very close to, and always above, the target cash flow.

clip_image009

In the real world, there can be thousands or even millions of variables and many more constraints, e.g., asset types, credit ratings, currencies, industries, and processing time becomes an issue. Excel Solver is easy to use, but it is not powerful enough, i.e., it can only handle 200 variables; that is why I limited the pool to 200 assets. But don't worry; there is a vast array of optimization software that is more powerful and smarter.

But even with Excel Solver, a free add-in that comes with Excel, I am able to demonstrate how we can easily solve problems that we would never be able to solve by trial and error.
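The same objective-and-constraints setup can be sketched in miniature. The data below is invented (three toy bonds over three years, small integer holdings), and the brute-force search merely stands in for Solver's linear-programming engine, which scales far better.

```python
from itertools import product

# Toy cash flow matching sketch (hypothetical data, not the 200-asset,
# 25-year problem above). Each bond: (price, [cash flow per year]).
bonds = [
    (98.0, [5, 5, 105]),    # 3-year bond with 5% coupon
    (97.0, [4, 104, 0]),    # 2-year bond with 4% coupon
    (99.0, [103, 0, 0]),    # 1-year bond
]
target = [150, 200, 300]    # liability stream we must cover each year

best_cost, best_units = None, None
# Brute force over small integer holdings; a real solver would use
# linear programming instead of enumeration.
for units in product(range(6), repeat=len(bonds)):
    cost = sum(u * price for u, (price, _) in zip(units, bonds))
    flows = [sum(u * cf[y] for u, (_, cf) in zip(units, bonds))
             for y in range(len(target))]
    # Constraint: projected cash flow >= target cash flow every year
    if all(f >= t for f, t in zip(flows, target)) and \
       (best_cost is None or cost < best_cost):
        best_cost, best_units = cost, units

print(best_units, best_cost)  # -> (3, 2, 2) 686.0
```

Only the first bond pays in year 3, which pins its holding down first; the rest of the search fills the earlier years as cheaply as possible, exactly the kind of interaction that makes manual trial and error hopeless at scale.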

Can you do trial and error faster than a computer?

Web Service 1-2-3

Have you ever done a software inventory in your organization? You would be surprised to find out how many calculation engines and financial models your organization has. And if you do a sample audit, you will realize that a lot of the business logic is very similar across these engines and models. It would be great if we could have a central calculation engine shared by all our systems and models. We would have consistent results, reduced risk, and effectively increased model efficiency.

So what are the problems? One of the most common challenges is that our systems, e.g. pricing, valuation, and asset liability management, live in a heterogeneous environment; they can be on mainframes, UNIX, Windows, the web or even tablets. They just don't talk to each other! Fortunately, a lot of IT leaders recognized the problem and developed an interoperable machine-to-machine interaction protocol that allows systems to interact in a standard prescribed by the W3C (World Wide Web Consortium). This standard, Web Services, was originally proposed by Microsoft in 1999 and quickly gained support from IBM and other major software vendors, e.g. Oracle, HP and Sun. Over the last twelve years, Web Services have evolved significantly and become the backbone of service-oriented architecture. Can we use a Web Service to develop a central calculation library that can be used from Excel, desktop, server, web and even smartphone / tablet applications? Let me demonstrate how easily a Web Service can be implemented, as simple as 1-2-3.

  1. Install Visual Studio 2010
  2. Download and install WCF REST Service Template 40 (http://visualstudiogallery.msdn.microsoft.com/fbc7e5c1-a0d2-41bd-9d7b-e54c845394cd)
  3. Create a new WCF REST Service project in Visual Studio 2010

In the New Project Dialogue, we select Online Templates, then select WCF.

image

You will see the WCF REST Service Template 40 that we just installed; select it, give the project a name, then click the OK button. The WCF REST Service Template 40 generates all the program files necessary to run a Web Service for us. If you build the project and run it, you will get the following web page.

image

Service1 is the name of the Web Service. Let's create an Add function for Service1.

Add the following lines to Service1.cs:

[WebGet(UriTemplate = "Add?a={a}&b={b}")]
public Int32 Add(Int32 a, Int32 b)
{
    return a + b;
}

After we rebuild / rerun the project, we can browse to http://localhost:40501/Service1/Add?a=1&b=2. A web page will be displayed with the following content:

<int xmlns="http://schemas.microsoft.com/2003/10/Serialization/">3</int>

This example is very simple, but you can see it is also very simple to get a Web Service up and running. If you want to find out more about Web Services, please stay tuned. I will show you how to call more sophisticated calculations within the Web Service, and how to call the Web Service from other applications, e.g. Excel, desktop applications and even smartphones.
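As a taste of calling the service from another application, here is a hedged Python sketch. The port and URL mirror the example run above (your port will differ), and the parser only assumes the serialized `<int>` payload shown; nothing here is specific to WCF beyond that.

```python
from urllib.request import urlopen
from xml.etree import ElementTree

def parse_result(xml_text):
    # The service returns e.g. <int xmlns="...">3</int>; extract the value
    return int(ElementTree.fromstring(xml_text).text)

def call_add(a, b, base="http://localhost:40501/Service1"):
    """Issue the same GET request a browser would, from any client."""
    with urlopen(f"{base}/Add?a={a}&b={b}") as resp:
        return parse_result(resp.read().decode("utf-8"))

# Parsing the exact payload shown above, without needing a live server:
print(parse_result('<int xmlns="http://schemas.microsoft.com/2003/10/'
                   'Serialization/">3</int>'))  # -> 3
```

With the service running, `call_add(1, 2)` would return 3; any platform that can issue an HTTP GET and read XML can consume the same endpoint, which is the whole point of the REST style.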

 

Can your reports dance?

We create a lot of financial reports, and at the same time we probably receive even more. Most financial reports are just full of numbers like the following one, and we have to look very hard for the information that we want, e.g. if the following is a control report, we should be alerted whenever a certain ratio is bigger than 0.9.

clip_image002

Excel 2010 offers Conditional Formatting, which allows us to add business context to our regular reports. Let's try to add some color icons next to the boring numbers.

clip_image003

We now have a report that actually highlights items that are critical to our business.

clip_image005
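Under the hood, an icon-set rule is just a threshold mapping. Here is a Python sketch of the same idea; the 0.9 alert level comes from the control-report example above, while the 0.8 warning level is invented for illustration.

```python
# What an icon-set rule effectively computes: map each value to a flag.
def flag(ratio, alert_level=0.9, warn_level=0.8):
    if ratio > alert_level:
        return "red"       # critical: needs immediate attention
    if ratio > warn_level:
        return "yellow"    # approaching the limit
    return "green"         # within tolerance

ratios = [0.75, 0.85, 0.93]
print([flag(r) for r in ratios])  # -> ['green', 'yellow', 'red']
```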

Let's look at another example, a report that we want to compare with the 12-month average.

clip_image006

The report contains the same numbers, but now we are looking at them from another angle.

clip_image008

Some may want to see a little more detail; so instead of just showing a simple arrow, we can use Sparklines to display the trend of the last 12 months' values, and even where the max / min are.

clip_image010

I can't make my reports dance, but Excel 2010 can give new life to our regular, boring financial reports. And there are many professional data visualization tools that can make information jump out from our reports, e.g. QlikView, Spotfire, Tableau, and don't forget to try out PowerPivot.

What tools are you using now?

All testing results correct!

and we are ready to deploy the application. I hate to play devil's advocate, but system development is more than just getting the correct results. In order to have a manageable, robust, and scalable system with maximum life expectancy, there are many important things that we have to take care of before we can deploy the system. The first thing I want to visit is actually testing.

“Did I just tell you that all testing results were correct?”

Software Testing

Software testing is a huge subject by itself and I won’t be able to cover everything but there are a few common issues that I would like to share:

  • Test plans / scripts should be properly documented – you may not believe it, but most people are not able to reproduce the tests they ran.
  • Every code path / condition / exception must be covered – I have rarely seen testing that covers more than 60% of the code.
  • Non-functional testing such as scalability, performance, and security should be carefully planned – the system may work well for a single tester and a few hundred test cases. Would it still work in a production environment with millions of records and hundreds of concurrent users?
  • User compatibility – you may be testing your application on Internet Explorer 9 / Office 2010 while your users still use Internet Explorer 6 / Office 2003. Good luck when you deploy the application to your users' computers.
  • Tools – do you have tools to automate testing, measure code coverage and track defects? One reason systems are not thoroughly tested is that we don't have enough resources. Appropriate testing tools automate testing, compare results and report defects; we can do more testing with fewer resources.
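On the automation point, even a tiny scripted test that covers the exception path beats manual re-testing, and it documents itself. A minimal Python sketch, with a hypothetical `ratio` function standing in for the kind of small calculation that often lives untested inside a spreadsheet model:

```python
import unittest

def ratio(numerator, denominator):
    """Hypothetical function under test."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return numerator / denominator

class RatioTests(unittest.TestCase):
    def test_normal_case(self):
        self.assertAlmostEqual(ratio(9, 10), 0.9)

    def test_exception_path(self):
        # Cover the exception branch too, not just the happy path
        with self.assertRaises(ValueError):
            ratio(1, 0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(RatioTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # -> True
```

Once the tests are scripted, re-running them after every change costs nothing, which is exactly how automation lets us do more testing with fewer resources.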

I have only covered the bare minimum of testing; I haven't covered any details about unit testing, integration testing, system testing, regression testing and many more. Wikipedia has a good article about software testing. You may want to go over it and see whether you have done all the necessary testing yet.

The rule of thumb is to find and fix problems as soon as possible!

Code Refactoring

All testing results are correct; are we ready to deploy the application? Not quite yet, if you want to deploy an application that is easy to extend and maintain! Let's review the code and see if we find some of the common code smells:

  • Copy and paste – I recently fixed a very small bug, but I had to go through nearly 100 different places. The same piece of code existed in over 20 different places, and I also found similar code in another 50+ places.
  • Jumbo functions – have you ever read a function with over 1,000 lines? It tries to do too much. By the time you get through half of it, you have already forgotten what it is supposed to do. You are lost in the function. Simple, small functions are much easier to fix and extend.
  • God objects – a God object is similar to a jumbo function, and it violates virtually every object-oriented principle that promotes faster development.
  • Global variables – they are difficult to trace and understand because they can be read / modified in any part of the application. Sometimes, you may not even know that you are using one.
  • Morse code / mystery variables – A1, B1, C1, Xn, Yn, Zn, and to make it worse, there are no comments. It can take hours to understand a simple if statement.
  • You can visit Martin Fowler's website for a more comprehensive catalog of refactorings.

The bottom line is that refactoring allows the application to produce the same results while running faster and being easier to debug and extend.
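A minimal before/after sketch of two of these smells, copy-pasted logic and mystery variables. The names and the 13% rate are invented for illustration; the key property is that both versions produce the same result.

```python
# Before: duplicated logic and "Morse code" variable names
def f(a1, b1):
    x = a1 * 1.13
    y = b1 * 1.13
    return x + y

# After: the repeated expression extracted into one named helper,
# with descriptive names; same result, one place to fix the rate
TAX_RATE = 0.13

def with_tax(amount):
    return amount * (1 + TAX_RATE)

def total_with_tax(price_a, price_b):
    return with_tax(price_a) + with_tax(price_b)

print(f(100, 200), total_with_tax(100, 200))
```

If the rate ever changes, the refactored version is edited in one place instead of the "nearly 100 different places" from the bug-fix story above.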

Security

It may sound obvious that any sensitive / critical data must be secured; however, a lot of business applications' security is nothing more than an illusion. The application may require a user login, but the data is not secured, or its security takes minimal effort to bypass.

Google "Excel Password" and you will not believe how many Excel password removal utilities are available; most of them are even free. How about Microsoft Access? Why don't you Google that too? Does your application store data in Excel? Apart from data, there may be situations where you also want to secure the code, your intellectual property. If you develop a financial model that can price products 10 times faster and 10 times more accurately, I strongly recommend that you not deploy your model in Excel / VBA.

Let’s deploy it now?

I am going to cover software deployment in my next blog, but I would like to share some of my own experiences from the last couple of years.

  1. It was supposed to be a short project. According to the project manager, everything was ready; I just had to migrate the application to the production environment. My first problem was that I could not even compile the programs successfully.
  2. Again, another simple deployment job. The challenge was that the job took over 10 hours when the users only had a 1-hour window.
  3. We ran a stress test on a server that had only 2 cores and 4 GB of RAM; it could support more than 200 concurrent users within 2 seconds. We ordered 3 servers, each with 8 cores and 32 GB of RAM; the web server crashed after 20 concurrent users.
  4. We tested the application on the development server and the integration server; our users tested it on the user acceptance test server. Everything was fine. When the application was finally deployed to the production server, nothing worked.
  5. Our power users developed a sophisticated financial model that was going to be distributed to their business partners. It was ready to deploy; they just wanted to secure the model so that their business partners would not be able to trace the logic and data. It was developed in Excel.

Deployment is very important, no matter how well the application is developed and tested. A poor deployment can still produce a terrible user experience.

Are you developing a system, or just programming?

It is easy to do programming today; there are so many powerful, developer-friendly tools, e.g. Excel / Access VBA, Visual Studio, MATLAB, R, etc. A lot of business users, who are analytical and logically smart, choose to develop their own business applications. These systems can be agile, cost-effective and deliver what the users want!

However, many business users have handed these hidden business applications back to the IT department within a few years. These lovely applications have grown into monsters that they can't control: they take too long to process; whenever they want to add a small enhancement, something unexpected fails; or certain disasters happen, e.g. the business user who created the application leaves the company.

What are their problems?  Why did these business applications have such short life expectancy?  How can we manage our applications?

Management

Interactive Dashboards using Excel?

When we need to prepare interactive dashboards or other data-driven charts, most of us think of Excel. At least, I did until I read an article by Glenna Shaw, a Most Valuable Professional (MVP) for PowerPoint. In her article, she shows us how to use PowerPoint to build an interactive dashboard.

Below is an example that Glenna Shaw shared with us; you can download it from her SkyDrive.

image

Andrew Chan helps you to make better and faster decisions