When is your Information Technology investment in general, and service-related investment in particular, CAPEX rather than OPEX? Consider an example: if a service is a value-add, then money spent on it will be CAPEX; if it is something you have to do just to keep your product alive, it's definitely OPEX.
From an architecture point of view, CAPEX services can be developed in an isolated environment and tied back to the product via a satellite interface. Many such services are offered via a browser and charged on a pay-per-use billing model. On the other hand, core business services like messaging and security are an integral part of the product and are tightly integrated with it. In most cases, the customer ends up paying for an OPEX service one way or another.
Recently, my cell phone provider offered me additional data storage for a fixed cost per month. This is a good example of a CAPEX service. The same provider charges me a fixed price for each text message [since I don't have a text plan] – I guess, for the provider, this is an OPEX service.
There are many things we consider while evaluating any technology, including the distribution model. With SOA, this distribution channel and the CAPEX / OPEX model will decide many things like coupling architecture, SLAs and revenue projections…
**I am not an accountant and do not understand the nuances involved in this CAPEX vs. OPEX discussion. So feel free to correct me…
Happy Holidays!
Ref:
http://www.stratetect.com/2008/06/capex-vs-opex-what-is-the-difference/
Sunday, December 27, 2009
SOA: CAPEX Vs OPEX
Friday, October 2, 2009
Healthcare Demo App on SQL Azure Platform
In our Azure user group meeting and at the LA CloudCamp, I heard many people asking about support for the Entity Framework on top of the SQL Azure platform. The best way to test / answer / find out is to build an application.
So I have developed a healthcare app [provider claim lookup] on top of SQL Azure. This app is based on the "Code Far" model – i.e. the app is running on a hosted server but the data is stored in the cloud.
Developed using: Dynamic Data Entities Web Application, Entity Framework, SQL Azure, ASP.NET / C#
Here is a link http://www.zimbatech.com/healthcaredemo/
abhi
www.zimbatech.com
Thursday, October 1, 2009
LA CloudCamp -Data in the Cloud
We had a very good CloudCamp in LA. It was my first unconference – I was not sure what to expect or what I would learn. But at the end of the day [around 11.45 PM or so], when I was walking up to my underground garage on the corner of 5th and Olive, I felt really good about it. It was an on-the-fly brainstorming session with 100-plus technology folks.
The way this camp was conducted was impressive, and the food [please read it as free food] was awesome. They had unlimited beer for those who would like to climb up into the cloud before getting into any serious conversation…
We focused on the general theme of data in the cloud. I started the conversation with CAP [not CRAP]. C: Consistency, A: Availability and P: Partition tolerance. It is mathematically proven that one can have at most two of these three qualities in any massively distributed system. Now if we are building an app for the cloud, this principle is very important as there is a network in the middle… it was a discussion that could almost go on for days if not weeks, and we had to wrap it up in half an hour. So the conclusion was twofold – know the limitations and choose what you want before you build your app – you can have any combination of two of CAP, viz. CA, AP or CP, but not all three…
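To make the trade-off concrete, here is a toy sketch of my own [not something from the talk – and in Python, just because it is quickest to demo]: two replicas of a key-value store with a simulated network partition between them. During the partition, a "CP" store refuses writes to stay consistent, while an "AP" store accepts them and lets the replicas diverge.

```python
class Replica:
    def __init__(self):
        self.data = {}

class TinyStore:
    """Two replicas; when partitioned, pick consistency or availability."""
    def __init__(self, mode):
        self.mode = mode                      # "CP" or "AP"
        self.replicas = [Replica(), Replica()]
        self.partitioned = False

    def write(self, key, value):
        if self.partitioned:
            if self.mode == "CP":
                # Consistent but unavailable: refuse the write.
                raise RuntimeError("unavailable during partition")
            # Available but inconsistent: only one replica sees the write.
            self.replicas[0].data[key] = value
        else:
            for r in self.replicas:
                r.data[key] = value

    def read(self, key, replica=0):
        return self.replicas[replica].data.get(key)

cp = TinyStore("CP")
cp.partitioned = True
try:
    cp.write("x", 1)
except RuntimeError:
    print("CP store refused the write")   # stays consistent, loses availability

ap = TinyStore("AP")
ap.partitioned = True
ap.write("x", 1)
print(ap.read("x", 0), ap.read("x", 1))   # replicas now disagree
```

Obviously a real system has many more moving parts, but this is the choice in miniature.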
Then Lynn Langit of Microsoft presented on SQL Azure. It was a very good, full-house presentation. Our small room was packed to its limit and Lynn did a great job of introducing this RDBMS in the cloud. Yes, we heard all those legitimate concerns one more time – how can you build a real database with 10 GB, and what about replication? But as Lynn said, this is V1 [and we know from our experience that MS gets the right product out with V3]. I learned one important lesson in this session – when there are non-Microsofties around, explain every acronym you use. For them, PDC and RTM are like JAOO to the Microsoft community.
Our final talk was on scaling data in the cloud. From what I understood [which might be way off the mark], there is a whole set of alternatives to the RDBMS model – things like Tokyo Cabinet, Hadoop, HBase, CouchDB, etc. [WOW – I remembered all these things]. The point of this discussion was: start thinking outside the box. There are other ways to think about transactions – like BASE – and ACID is not the only way to achieve consistency. This session was more techno-philosophical. The takeaway, as per our DBA friend – "RDBMS is crap, start thinking about alternatives". In the end, the data structures you want to use depend on the type of applications you want to build. Facebook and a banking app are two extreme end points on this scale, and each has its unique requirements – albeit both of them deal with large datasets…
In the end, walking back to the parking lot, I heard this interesting comment – "I went to watch a movie and they told me to act in it; there was no Tom Cruise or Don Box or Larry Ellison in the room, and I ended up presenting the show. Oh! Well – thank god my wife was not in the room… please pardon me if I said something stupid, you know I was a little bit drunk…"
abhi
www.zimbatech.com
Saturday, September 26, 2009
SQL Azure Object Browser
Yesterday, I did a blog post on how to alter a stored procedure. On the Microsoft Azure Forum, Gaurav suggested two very good options. Both of them are scaled-down versions of SSMS – but are way better than the existing SSMS support for SQL Azure – they provide a decent object browser and you don't have to cancel the login the first time you connect to the SQL Azure database...
1. Cerebrata supports the Windows Azure Platform. An online demo can be found at
http://www.cerebrata.com/Blog/post/Browser-based-SQL-Azure-Explorer...
I am impressed with the functionality offered here
2. SQL Azure Manager by http://hanssens.org/post/SQL-Azure-Manager.aspx
This is a ClickOnce app and only supports tables and views, but you can run your TSQL here. One can use a combination of these two apps to simulate almost 75% of the SSMS support, which is good enough for most of the routine stuff.
abhi
www.zimbatech.com
Friday, September 25, 2009
Alter Stored Procedure in SQL AZURE
The absence of an object browser in SQL Azure throws up many challenges. For example, how would you update a stored procedure in your SQL Azure database? The challenge is to get the source code from the cloud database.
select * from sys.objects
will only display the objects in your database, not the source code for any stored procedure or view.
Here is my hack to get the source code for any stored procedure.
1. SELECT * from information_schema.Routines
will give you the list of all stored procedures. For example, suppose you want to alter the stored procedure usp_Test. Then
SELECT routine_definition from information_schema.Routines where specific_name ='usp_Test'
2. Now copy the contents of routine_definition – this is the source code for usp_Test. In my case this is
CREATE PROCEDURE usp_Test AS BEGIN -- SET NOCOUNT ON added to prevent extra result sets from -- interfering with SELECT statements. SET NOCOUNT ON; -- Insert statements for procedure here select count(*) as TotalVisits1 from visits END
3. Format this code:
CREATE PROCEDURE usp_Test
AS BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
select count(*) as TotalVisits1 from visits
END
4. Change CREATE PROCEDURE to ALTER PROCEDURE:
ALTER PROCEDURE usp_Test
AS BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
select count(*) as TotalVisits1 from visits
END
Update the source code as required and execute the TSQL
5. This will update your stored procedure in the cloud database.
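The text surgery in step 4 is mechanical enough to script. Here is a minimal sketch of my own in Python [just an illustration – the actual workflow above is pure T-SQL], which takes the routine_definition string you copied out of information_schema.Routines and rewrites it into an ALTER statement:

```python
import re

def to_alter(routine_definition: str) -> str:
    """Turn the CREATE PROCEDURE text pulled from
    information_schema.Routines into an ALTER PROCEDURE statement."""
    # Replace only the first occurrence, case-insensitively,
    # so any "CREATE PROCEDURE" inside the body is left alone.
    return re.sub(r"\bCREATE\s+PROCEDURE\b", "ALTER PROCEDURE",
                  routine_definition, count=1, flags=re.IGNORECASE)

src = ("CREATE PROCEDURE usp_Test AS BEGIN SET NOCOUNT ON; "
       "select count(*) as TotalVisits1 from visits END")
print(to_alter(src)[:24])   # ALTER PROCEDURE usp_Test
```

You would still paste the result back and execute it against the cloud database by hand.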
I guess the best practice would be to maintain a local copy of your database and keep it in sync with your cloud database. That way we can run the code on our local copy and just run the script against the cloud database.
Oh! Well – we are just learning and I did not have a local copy of my test database. Again, this is not an elegant solution, but it does work – so if you can recommend / find any better solution, please let us know… Thanks.
Monday, September 21, 2009
SQL AZURE Service
Finally, I got the SQL Azure service up and running. It is running on the hosted server:
MyCloud Service
While working on this app, I found some interesting things. Here is a short summary
1. select * from sys.objects IS YOUR NEW OBJECT BROWSER
2. truncate table tablename will not work in SQL Azure – yes, drop and delete do work.
3. select @@servername will not work, but select @@version does work
4. DO NOT TRY exec sp_help and exec sp_who – they will not work
5. Insert WILL NOT WORK if you forget to add a primary key on your table – for example
create table test
(
my_id int,
my_name varchar(10)
)
insert into test values (1,'abhi')
and you will get an error – "Heaps can not be replicated tables. Please create a clustered index for the table." Just add a PK on my_id and things will work as expected.
6. Migrating data from your local DB to the cloud is not easy – check out http://www.stephenforte.net
7. Copy the connection string from your https://sql.azure.com/ServerInfo.aspx page – this is the easiest and fastest way to connect your app with SQL Azure
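Gotcha 5 is easy to hit, so one could add a crude pre-flight check to any script that ships DDL to the cloud. A sketch of my own [Python, string matching only – a real check would parse the DDL]:

```python
def has_primary_key(ddl: str) -> bool:
    """Crude guard: does this CREATE TABLE declare a PRIMARY KEY?
    SQL Azure rejects inserts into heaps (tables without a clustered index)."""
    return "PRIMARY KEY" in ddl.upper()

bad = "create table test (my_id int, my_name varchar(10))"
good = "create table test (my_id int primary key, my_name varchar(10))"
print(has_primary_key(bad), has_primary_key(good))   # False True
```

Not bulletproof [a clustered index created separately would also satisfy SQL Azure], but it catches the common slip.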
Also, I will be presenting on SQL Azure at the upcoming SoCal Code Camp 2009 @ USC.
Wednesday, September 16, 2009
Gone with the wind…
Recently, I drove almost 85 miles for an interview. When I entered the office, I was greeted by a very enthusiastic developer. And then came the moment of truth – he walked me into a room and told me to take a hands-on programming test. OK. The test was on VB6 programming. Well, what happened next is a very funny story, but it kept nagging at me that I hadn't thought about the VB environment for a long time… with all this .NET code around, I had just forgotten about my old pal VB. Yes – it's been almost a decade since I wrote any code in VB. And then, while talking to somebody about COM/DCOM horror stories, I was told that official support for the VB development environment is gone. I found the following post on the VB resource center:
“The Visual Basic 6.0 IDE will be supported on Windows Vista and Windows Server 2008 as part of the Visual Basic 6.0 Extended Support policy until April 8, 2008… ”
From http://msdn.microsoft.com/en-us/vbrun/ms788708.aspx
Yes – the VB runtime will be supported till 2019, but support for Visual Studio 6 is gone. Well, everyone using VB 6 is aware of this fact, but for me, on a personal note, it is a weird feeling...
"I have forgot much, Cynara! gone with the wind."
Friday, August 21, 2009
Graph of diminishing questions
How do you feel if somebody asks too many questions?
Well, whenever we have a new team member joining our group, we all go through the same experience. I call it the graph of diminishing questions [as shown below].
We see one of the following reactions, or some combination of them:
1. Why is s/he asking so many questions? Come on, it's a simple business application – and you are hired to work on it – go figure it out. We gave him / her access to all the systems, code, database – what more can we do? I mean, put some breakpoints, run some queries, go figure it out on your own – that's why you are hired – don't ask me all these stupid questions.
2. I am not sure why s/he is not asking any questions. If you don't understand, then just ask us. But please don't say yes when you don't get it. You know, just yesterday I was at his/her desk, and by looking at the screen I can tell you s/he is lost. I don't think s/he knows anything about the system – and I am not sure why s/he is not asking any questions.
3. Well, s/he is a smart person, but I think – I don't know – s/he might have over-promised in the interview and is now struggling. I think s/he is not as smart as we were told [by somebody – maybe our boss…].
4. We don't do too much training here – we throw the baby in the water, and if she can swim, she can stay.
5. We do work with new people, explain the system to them, and also help them understand the real challenges.
Thursday, June 11, 2009
How many CPUs can your operating system handle?
How many CPUs can your operating system handle? 2 / 4 / 8 – many a time the answer is very simple: I don't know. How do you find out? Here is one easy way to get to the bottom of this issue. Go to the Task Manager, right-click on any process, and then click on the Set Affinity option as shown below. You will get the answer. Note: this is the number of CPUs that can be managed by your OS, not by the underlying hardware.
And here are the specs for the machine I used in this exercise:
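If you would rather ask programmatically than click through Task Manager, most runtimes expose the logical CPU count directly – in .NET it is Environment.ProcessorCount; a quick sketch in Python:

```python
import os

# Number of logical CPUs visible to the OS -- the same figure the
# Set Affinity dialog enumerates (not the physical socket count).
logical_cpus = os.cpu_count()
print(logical_cpus)
```

Handy when you need the number inside a script rather than on screen.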
Sunday, May 3, 2009
Multiplexing strategy for reducing software licensing cost
Software licensing is a complex and confusing topic. Most of us know how to use and master any given software product like SQL Server, Oracle, or Crystal Reports. But purchasing software and figuring out the real cost is a challenge. Questions like whether to go with named-user licensing, per-core licensing, licensing by number of processors, or by total number of CPUs are confusing. For example, Microsoft will charge you based on the number of processors in your server irrespective of the number of CPUs / cores per processor, and Oracle will treat each core as half a processor. So, on a 4-processor quad-core server [4 processors with 4 CPUs each = 16 CPUs or cores], SQL Server will cost you 4 licenses and Oracle will result in 8 licenses [16 * 0.5 = 8] [1].
Now this might be acceptable for enterprise-grade tools like an RDBMS or a reporting solution. But many a time we end up buying 8 or 10 copies of small commodity software used by a department or a group of users working on a special project. Most of these tools are sold on a per-instance basis. So we end up paying for 10 copies and installing them on individual desktops. This also results in a maintenance nightmare.
There is a better way to deal with such commodity tools – a strategy known as multiplexing. It's very straightforward – all you have to do is install one instance of the commodity software on your central server and write a wrapper around it. The wrapper can be exposed over the network [LAN / WAN / Internet] using http, https etc. Yes, you will have to write a web application that exposes the interface of the commodity tool. Many a time, this strategy will work like a charm [except if the licensing agreement for the commodity tool explicitly denies permission for any such exercise]. Note – this does not involve changing any of the bits / code in the commodity software – all you are doing is exposing its public interface over the network. Threading and deadlocks might pose some challenges when shared resources are simultaneously utilized by many users, but this can be avoided by placing an explicit lock on shared resources like file folders. Additionally, this public interface can be modified to accept inputs from internal resources like local databases, the file system etc.
If used judiciously, multiplexing can deliver a great business value.
Reference:
1. http://www.microsoft.com/Sqlserver/2005/en/us/special-considerations.aspx
Monday, April 20, 2009
Web 2.0 and Cloud Computing
It is always good to compare different technologies. That way we can see whether a technology is maturing with time and is worth considering for real projects. But the challenge is to find a tool that helps in this endeavor. I found the Gartner Hype Cycle [GHC] really effective for comparing different technologies and their evolution.
So I wanted to see how Web 2.0 has been doing over the last three years and where cloud computing stands on this technology curve. I always believed that an image is worth a thousand blogs… so here you go – the GHC for 2006, 2007 and 2008 and the evolution of Web 2.0 vs. cloud computing…
Reference
for 2006
http://www.saastream.com/my_weblog/images/2007/10/02/gartner_hype_cycle_emerging_techn_4.jpg
for 2007
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKRtg86I2SFDcA_IyZK3lG8u5d7KOpX0eEQAaPqmLqlT7k9wLr61vlCbEYmruMdU__jyGJBIBN-scyV7gkIi21CZGD91Bzhlu2XlQ1LKZV58qpTUe39xP7PpJLC1RNEuayK-Latvy45mTF/s400/hype-cycle-for-emerging-technologies-2008-400.jpg
for 2008
http://www.flickr.com/photos/lafabriquedeblogs/2796357787/
Sunday, April 19, 2009
Virtualization : as experienced from ground zero
Virtualization is a big theme in contemporary enterprise architecture. Many documented case studies highlight the benefits of virtualization, including better hardware utilization, greater ROI and lower TCO. The deal is so great that you can't overlook this option…
But – and this but is a big one – the reality is not that green. From a software point of view, a virtual machine cannot utilize 100% of the physical infrastructure. In our example, we have 8 CPUs and 128 GB of RAM, but each virtual machine can only use up to 4 CPUs and 64 GB of RAM. Systems like an RDBMS thrive in a many-core / multi-processor environment, so by limiting the number of CPUs or the memory, we are limiting its capacity. This is what we experienced when we decided to use a hypervisor [and this is mostly true for any hypervisor].
In future versions of hypervisors, this limitation will be fixed; based on the processing load, each VM will be able to optimize its physical resource utilization. But as of today, there are hard limits on the number of CPUs and the amount of RAM each VM can use.
So, to conclude our test case – let's compare machine A, with 8 CPUs and 16 GB of RAM, against machine B, with 8 CPUs and 128 GB of RAM. Machine B is virtualized and runs 2 VMs, each with 4 CPUs and 64 GB of RAM. Both VMs on machine B operate in parallel and are used to run an instance of SQL Server 2005 and a file server. Which SQL Server instance will run faster – the one on machine A or the one on machine B? What if we bump up the RAM in machine B to 256 GB? These are some solid questions one should consider seriously before jumping on the virtualization bandwagon. Yes, all these issues will be fixed in future versions and all vendors are working on them. But from a pure architecture point of view, as of today, one should take these limitations seriously.
A lot will depend on the processing load and the type of applications running on these virtual machines. For bursts and spikes you need a ton of RAM and huge processing power for a while, but after that all that capacity just idles. What if that burst or spike is the very reason for provisioning such huge hardware capacity? Do we run this bursty system in a virtualized environment, let it suffer under the VM's CPU limit, and maximize our hardware utilization – or go the traditional way and provision a big machine just to handle this one mission-critical load efficiently and effectively? These are some critical questions one faces while moving production applications into a virtualized environment. There are no easy answers, and the only way to ensure success is by doing as much testing as possible.
Sunday, February 1, 2009
On the Benefits of Learning Multiple Languages
Finally, I got to the pile of magazines I had wanted to read for a long time. I found this article in Visual Studio Magazine. I liked the title and started reading. The basic premise is – "Learning other programming languages can make you more fluent and give you a better understanding of the language you depend on primarily."
So I started coding in the language I know best – C# – and wrote some code:
using System;
namespace demo
{
internal class Test
{
public static void Main()
{
const int i = 2;
//static method call
Console.WriteLine(IsEven(i));
//generic delegate
Func<int, bool> f = IsEven;
Console.WriteLine(f(2));
//lambda expression
Func<int, bool> f1 = j => j%2 == 0;
Console.WriteLine(f1(2));
Console.ReadLine();
}
public static bool IsEven(int number)
{
return number > 0 ? number%2 == 0 : false;
}
}
}
Now I wanted to convert this code to Visual Basic. Interestingly, I found a decent conversion method to get this done: .NET Reflector. I used Reflector to disassemble the C# .exe and then convert the code to VB. Reflector did a good job and produced the following VB code:
Friend Class Test
' Methods
Public Shared Function IsEven(ByVal number As Integer) As Boolean
Return IIf((number > 0), ((number Mod 2) = 0), False)
End Function
Public Shared Sub Main()
Console.WriteLine(Test.IsEven(2))
Dim f As Func(Of Integer, Boolean) = New Func(Of Integer, Boolean)(AddressOf Test.IsEven)
Console.WriteLine(f.Invoke(2))
Dim f1 As Func(Of Integer, Boolean) = j => ((j Mod 2) = 0)
Console.WriteLine(f1.Invoke(2))
Console.ReadLine
End Sub
End Class
The only place it tripped up was on the C# lambda expression syntax =>, so I had to do a little bit of Googling to find the correct VB syntax. And finally I got the VB code:
Friend Class Test
' Methods
Public Shared Function IsEven(ByVal number As Integer) As Boolean
Return IIf((number > 0), ((number Mod 2) = 0), False)
End Function
Public Shared Sub Main()
Console.WriteLine(IsEven(2))
Dim f As Func(Of Integer, Boolean) = New Func(Of Integer, Boolean)(AddressOf IsEven)
Console.WriteLine(f.Invoke(2))
Dim f1 As Func(Of Integer, Boolean) = Function(j) j Mod 2 = 0
Console.WriteLine(f1.Invoke(3))
Console.ReadLine()
End Sub
End Class
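For fun, here is the same little program in a third language – a Python rendition I added for comparison, keeping the original's quirk that non-positive numbers report False:

```python
def is_even(number: int) -> bool:
    # Mirrors the C#/VB IsEven: non-positive numbers return False.
    return number % 2 == 0 if number > 0 else False

print(is_even(2))            # True

# The delegate / AddressOf step is just plain assignment in Python:
f = is_even
print(f(2))                  # True

# And the lambda version:
f1 = lambda j: j % 2 == 0
print(f1(2))                 # True
```

Three languages, three syntaxes, one idea – which is more or less the magazine's point.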
I enjoyed this exercise. In fact, in the same edition of the VS magazine, I found a couple of interesting articles – What VB Devs Should Know About C# and What C# Devs Should Know About VB.
If you are involved in any migration effort from VB to C# or vice versa, or work in a mixed environment with a lot of VB and C# code co-existing peacefully, then you will find these articles really useful.