Authenticating with the Zoho CRM API

Recently, I was integrating an ASP.NET application with Zoho CRM, and found the API to be a little awkward.  The examples for C# are also incomplete, and I wanted to outline some of the steps to get up and running.

The first thing you need to do before being able to make any requests is generate an authentication token. There’s no way to do this from the web interface, so the API must be used; once you’ve generated a token, you can save it and will not have to generate a new one.

Fortunately, the process is straightforward and well documented.  Simply make a GET request to Zoho’s auth token URL (you can even do this using your browser), passing your credentials in the query string: [username]&PASSWORD=[password]

This will return a properties-style response like this:

#
#Mon Sep 08 04:39:37 PDT 2014
AUTHTOKEN=[your-auth-token]
RESULT=TRUE

Copy the authtoken and save it somewhere, as you’ll need it for any other API requests.  It only needs to be generated once and does not expire.
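Pulling the token out of the response body is simple enough to do in code. Below is a minimal sketch (the class and method names are my own; fetching the body itself is left to something like WebClient.DownloadString) that parses the properties-style response and extracts the AUTHTOKEN value:

```csharp
using System;
using System.Linq;

public static class ZohoAuth
{
    // Extracts the AUTHTOKEN value from Zoho's properties-style auth
    // response (lines like "AUTHTOKEN=..." and "RESULT=TRUE").
    // Returns null if no AUTHTOKEN line is present.
    public static string ParseAuthToken(string responseBody)
    {
        return responseBody
            .Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries)
            .Select(line => line.Trim())
            .Where(line => line.StartsWith("AUTHTOKEN=", StringComparison.Ordinal))
            .Select(line => line.Substring("AUTHTOKEN=".Length))
            .FirstOrDefault();
    }
}
```

You’d typically fetch the body with new WebClient().DownloadString(url), pass it to this method, and persist the returned token.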

I’ll cover retrieving, adding and updating records for standard and custom modules in future posts.

Getting started with Salesforce and C#

I’ve been working on integrating an existing web application with Salesforce, and it was a little more work than I expected.  Some of this was down to unfamiliarity with the platform, which I’ve never used before today.  Here are some of the key steps involved and things to note:

1. The cheapest plan that offers API access is the enterprise plan.  At $125/month/user, this seems a bit on the expensive side for small businesses, but I digress… In any case, if you’re using a different (cheaper) plan, then the API is a paid extra and not included as standard.

2. Creating a Salesforce developer account is free, and is the easiest way to start using the API.

3. Add your IP address to your account or your API access won’t work.  You can do this by logging in, and then doing the following: Click on Setup in the top right corner next to your name, expand the Security Controls section on the left hand side (in the “Administer” area), click on Network Access, then New. Enter your public IP address in the Start IP Address and End IP Address boxes. Click Save.

4. Next you need to get the URL of the WSDL file, which we’ll add to our project in Visual Studio to generate the web service proxy.  To get this, expand the Develop section on the left hand side (under the “Build” area), then click on API. Right-click on the Generate Enterprise WSDL link and copy it.

5. Once that’s done, create a new project in Visual Studio.  For this example, let’s use a C# Console Application.  I’m assuming use of VS2010+ here (the process of adding a web reference is slightly different but easier in prior versions).

6. First we’ll need to add a reference to our WSDL from Salesforce to generate the web service proxy. Right-click on References in the Solution Explorer and select Add Service Reference. Click Advanced (bottom left of the dialog box), then Add Web Reference.  Paste the URL from step 4 above into the URL box and click the green arrow.  You’ll probably be prompted to log in to Salesforce. Once that’s done, you should see the WSDL or service method summaries in the window.  On the right hand side, under “Web Reference Name”, type salesforce and click Add Reference.

7. Your project now has a reference to Salesforce, and in the solution explorer, you should see a new node titled Web References with salesforce under it, which we just created.

8. Paste the code below into your main method.  Note, you will need to change the username and password placeholders below to your details (it appears Salesforce doesn’t offer an API key option).

// Requires using System and System.Linq, plus your salesforce web reference namespace
private static void Main(string[] args)
{
    Console.WriteLine("First Name?");
    var firstName = Console.ReadLine();

    Console.WriteLine("Last Name?");
    var lastName = Console.ReadLine();

    var contact = new Contact();
    contact.FirstName = firstName;
    contact.LastName = lastName;
    contact.Email = string.Concat(firstName, ".", lastName, "@example.com"); // placeholder domain

    using (SforceService sfs = new SforceService())
    {
        var login = sfs.login("your-username", "your-password");

        if (login.passwordExpired)
        {
            Console.WriteLine("Salesforce password expired!");
        }

        sfs.Url = login.serverUrl;
        sfs.SessionHeaderValue = new SessionHeader();
        sfs.SessionHeaderValue.sessionId = login.sessionId;
        var userinfo = login.userInfo;

        Console.WriteLine("Logged in as {0}", userinfo.userFullName);

        var sr = sfs.create(new sObject[] { contact })[0];

        var result = sr.success;

        if (!result)
        {
            Console.WriteLine(string.Join(", ", sr.errors.Select(x => x.message)));
        }

        Console.WriteLine(result ? "Created" : "Not Created");
    }
}

9. This code sample will prompt you to enter a first name and last name, and will create a new contact with these details.

10. To view the newly created contact, click on the “Contacts” tab in the Salesforce web interface. Choose “All Contacts” from the View dropdown and click Go. Your new contact created from the command line application should be in the list.
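You could also verify the contact programmatically: the same SforceService proxy exposes a query method that takes a SOQL string. When splicing user input into SOQL, quotes and backslashes need escaping; here’s a small sketch of building that string (the class and method names are mine, not part of the Salesforce API):

```csharp
using System;

public static class Soql
{
    // Escapes backslashes and single quotes per SOQL's quoted-string rules.
    public static string EscapeString(string value)
    {
        return value.Replace("\\", "\\\\").Replace("'", "\\'");
    }

    // Builds a SOQL query for contacts matching a last name.
    public static string ContactsByLastName(string lastName)
    {
        return "SELECT Id, FirstName, LastName, Email FROM Contact WHERE LastName = '"
            + EscapeString(lastName) + "'";
    }
}
```

After logging in, pass the resulting string to sfs.query(...) and iterate the records on the returned QueryResult.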

This is far from a definitive guide, but should hopefully help anyone new to Salesforce overcome some of the initial hurdles in getting started with the API using .NET.

Debugging macro for VS.NET

I’ve found the macro below for starting a debugging session to be a massive time-saver. Much easier than going through the ‘attach to process’ UI screens all the time.

The steps below can be used to add a button to a toolbar in VS for attaching to IIS/IISExpress.

Tools -> Macros -> Macros IDE

Right-click MyMacros -> Add -> Add Module

Name: DebuggingMacros

Paste the code below into the module.

Public Sub AttachToWebServer()
        ' Candidate web server process names (adjust to suit your setup)
        Dim Processes As New System.Collections.Generic.List(Of String)(New String() {"aspnet_wp.exe", "w3wp.exe", "iisexpress.exe"})
        Dim Attached As Boolean = False
        For Each Proc In Processes
            If (AttachToProcess(Proc)) Then
                Attached = True
            End If
        Next
        If (Not Attached) Then
            System.Windows.Forms.MessageBox.Show("Can't find web server process")
        End If
    End Sub

    Private Function AttachToProcess(ByVal ProcessName As String) As Boolean
        ' Attach the debugger to the first local process matching the name
        For Each Proc As EnvDTE.Process In DTE.Debugger.LocalProcesses
            If (Proc.Name.EndsWith(ProcessName)) Then
                Proc.Attach()
                Return True
            End If
        Next
        Return False
    End Function

Right-click toolbar -> customize -> commands tab -> Toolbar radio -> Build -> Add Command

Select Macros from the categories list, Macros.MyMacros.DebuggingMacros.AttachToWebServer from the list of commands on the right.

Rename the button to ‘Attach to IIS’ (or whatever you want).

To use, simply click on the button to attach the debugger to IIS/IISExpress.

Saving username and password with TortoiseGit

Saving your login details in TortoiseGit is pretty easy.  Saves having to type in your username and password every time you do a pull or push.

1. Create a file called _netrc with the following contents (the machine line names your Git host):

machine <your-git-host>
login <login>
password <password>

2. Copy the file to C:\Users\<your-username> (or another location; this just happens to be where I’ve put it)

3. Go to command prompt, type setx home C:\Users\<your-username>

Note: if you’re using something earlier than Windows 7, the setx command may not work for you.  Use set instead, and add the home environment variable to Windows via the Advanced Settings under My Computer.

Installing Solr on Windows 7 x64

Took a few steps to get this working.

1. Download Java SDK

2. Download Tomcat 7

3. Download Solr 3.4.0

NOTE: Make sure you download 3.4.0 and not 3.5.0.

4. Install the Java SDK

5. Install Tomcat on Windows

6. Verify Tomcat works by trying to browse to http://localhost:8080/


7. Right-click on Tomcat icon in system tray, select stop service

8. Unzip the Solr download to C:\apache-solr-3.4.0

9. Copy C:\apache-solr-3.4.0\example\Solr to C:\Solr

10. Copy C:\apache-solr-3.4.0\dist\apache-solr-3.4.0.war to C:\Program Files\Apache Software Foundation\Tomcat 7.0\webapps\solr.war

11. Right-click on the Tomcat icon in the system tray, choose Configure, go to the Java tab, and add the following line at the end of the Java Options:

-Dsolr.solr.home=C:\Solr

12. Start Tomcat by right-clicking on the system tray and selecting “Start Service”

13. Browse to http://localhost:8080/Solr/ … You should see “Welcome to Solr”.

14. Click on “Solr Admin”.  You should see the Solr admin interface.
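From here you can query the index over HTTP via the /select handler. As a hint of what those requests look like, here’s a small helper (the class and method names are mine; it assumes the default handler path) that builds a properly escaped select URL you could then fetch with WebClient:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class SolrUrlBuilder
{
    private readonly string _baseUrl;

    public SolrUrlBuilder(string baseUrl)
    {
        _baseUrl = baseUrl.TrimEnd('/');
    }

    // Builds a /select URL, percent-encoding each parameter so queries
    // with spaces or special characters stay valid.
    public string Select(IEnumerable<KeyValuePair<string, string>> parameters)
    {
        var query = string.Join("&", parameters
            .Select(p => Uri.EscapeDataString(p.Key) + "=" + Uri.EscapeDataString(p.Value))
            .ToArray());
        return _baseUrl + "/select?" + query;
    }
}
```

For example, q=hello world against the instance above becomes http://localhost:8080/Solr/select?q=hello%20world.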

Upgrading to iOS5

Well, seeing as the only thing I seem to have been blogging about recently is the iPad, I thought I’d mention that I decided to sacrifice my jailbreak and upgrade to iOS5.  I’ve been using it for about a week now and it’s definitely been the right decision, with the device being much more responsive and crashing far less.  It’s a little sad losing sbsettings and multifl0w, but the increased reliability makes it worthwhile.

Not quite ready to upgrade my iPhone 4, even though it’s starting to feel like Windows and needs to be rebooted or resprung at least once a day.  The other benefit, of course, is that with both devices running iOS5, I’ll get the full benefit of iCloud, with apps, photos, etc. syncing between devices.

Upgrading iOS firmware on jailbroken iPad

I’ve been holding off on upgrading the firmware on my iPad since I bought it, as I didn’t want to lose my jailbreak, or end up with a tethered jailbreak.  Although it doesn’t leave the house much, having to connect the device to a PC to reboot it sounds like too much hassle.

With the release of greenpois0n, I finally took the plunge and upgraded from 3.2 to 4.2.1.  It was actually less hassle than I expected, though largely because the iPad doesn’t have the same problems as the iPhone, where upgrading via Apple will also upgrade the baseband, which could possibly break any carrier unlock.

The whole process took a couple of hours, including downloading and installing the latest version of iTunes, upgrading the firmware from within iTunes, jailbreaking using the greenpois0n app, and then re-installing the jailbroken apps (sbsettings, multifl0w, infinidock, winterboard, aquarium hd, fake carrier).

I’m yet to do this on my iPhone, and I’m still deciding between two options. I could use the official Apple upgrade and then jailbreak with greenpois0n, which would also upgrade the baseband and break the carrier unlock – I’m not using the unlock anyway, but losing it makes the phone harder to sell in future. Or I could restore a custom 4.2.1 IPSW, but go through all the aggravation of re-installing all of my apps manually.

Either way, it’s nice to finally have 4.2.1 on the iPad.  I’m a big fan of folders and organizing my apps into groups, so that’s the biggest obvious benefit for me.  I’m not using AirPlay (haven’t got an Apple TV), and AirPrint seems useful but only works with selected printers, though there seem to be some apps and hacks to work around that which needs further investigation.


Installing .NET 3.5 SP1 on Windows 7

I’ve been trying to install .NET 3.5 SP1 on my Windows 7 machine, but had a strange problem where running the installer would just do nothing. I tried both the bootstrapper and the full version, with no luck. After some fiddling, the following steps seemed to fix the problem:

1. If installed already, uninstall the .NET framework via Control Panel – Turn Windows features on or off.

2. Reboot

3. Stop IIS – from a command prompt, run iisreset /stop

4. Download .NET Framework 3.5 SP1

5. Run the installer

6. Reboot

The installer might display an error during step 5, but .NET Framework 3.5 SP1 should still be installed correctly.

Revisiting Backups

During December, I was out of town and away from my desktop PC, intending to connect to it using LogMeIn to keep working. Although my laptop has enough disk space and is a reasonably well-specced machine, my desktop PC is my main machine with everything already set up, and remote access software generally works well.

Unfortunately, during this period, the Western Digital Raptor in my desktop PC with all my data decided to fail. This was my first real drive failure, and having it happen while I was 4,000 miles away on a different continent really magnified the problem. The first sign was connecting to my PC and seeing two drive letters vanish into thin air. Rebooting the PC ended in losing the connection permanently (which I later learned was because it had hung on boot while detecting the drives), followed by a couple of hours of troubleshooting with a friend who managed to get in front of my PC, and finally accepting that the problem could not be easily resolved and leaving the machine switched off until I could get back.

Fortunately, the code I needed to work with was in an offsite SVN repository, and essential dev tools like Visual Studio and SQL Server are available from MSDN, allowing me to get my laptop into a state that was at least usable. This let me keep working, but it did get me thinking about my general backup strategy.

About a year or two ago, I tried several online backup solutions, and although I preferred Mozy’s backup software, the actual backup process refused to work properly on my machine, so I eventually settled for Carbonite, in tandem with SyncBack SE for syncing changes to an external drive. This strategy worked fine until recently, when I upgraded to Windows 7. Unfortunately, Carbonite’s software had some issues with this and I didn’t re-install it, leaving my PC without any form of online backup.

Reluctant to have no backup in place at all, I decided to go for a disk imaging solution instead and eventually settled on Macrium Reflect. The Pro version offers scheduled backups, so I set up weekly full backups and nightly incremental backups of my system and data drives. The backup images are stored on a third separate hard disk in my PC and copied over to an external 1TB drive to offer some redundancy.

Once I got back to FL, getting back up and running was mostly a case of fitting a new drive and simply restoring from the most recent image created by Macrium Reflect. The whole process took a couple of hours and was relatively painless. However, the Raptor was partitioned into two drives, one containing utilities, and that one wasn’t backed up as it didn’t seem like there was anything too important on there. Getting the essential utilities back took a couple of hours too; a lot of time and hassle could have been saved by backing that drive up as well.

It seems like Carbonite have fixed most of their issues with Windows 7, but there’s still a conflict with SVN where Carbonite causes the SVN icon overlays to disappear. Generally unsatisfied with Carbonite, my search for a viable online backup solution continued.

It’s been a couple of weeks since I started using JungleDisk (as suggested by Shawn Wildermuth), and so far, I’m liking it. The cost structure is a bit different from Carbonite: the desktop edition costs $3 per month plus the cost of storage with Amazon S3, which is about $0.15 per gigabyte per month, plus fees for data transfer – though transfer is currently free until June 30th, 2010. There’s a slightly cheaper version for backups only at $2 per month, but with the added features of folder sync and a mapped network drive in the cloud, the desktop edition is easily worth the extra cost.

The software doesn’t look as fancy as Carbonite, though I actually prefer my Windows applications to look like Windows applications rather than having big fonts and bright colors, and JungleDisk definitely offers users more control over their backups. However, it does lack shell integration which would’ve been nice, and I don’t think there’s any way to restore files via the web. But the features all work well and as expected. It’s also great to be able to use the software on multiple machines, with the only additional cost being incurred for the extra storage.

The yearly fee for Carbonite is about $55. My backup is about 60GB, and the first 5GB is free, so storage will be about $8.25 per month. Adding in the JungleDisk fee of $3 per month, the total cost per year is about 2.5 times the cost of Carbonite – still not unreasonable, and given the cost and pain of losing important data, it’s worth it.
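As a sanity check on that arithmetic (the $8.25 figure implies S3 storage at roughly $0.15 per gigabyte-month on the 55 billable gigabytes), here’s a tiny helper – the names are mine – computing the monthly cost:

```csharp
using System;

public static class BackupCosts
{
    // Monthly cost: billable storage (total minus the free allowance)
    // at the per-GB S3 rate, plus the flat desktop edition fee.
    public static double MonthlyCost(double totalGb, double freeGb,
                                     double perGbMonth, double desktopFee)
    {
        return (totalGb - freeGb) * perGbMonth + desktopFee;
    }
}
```

With 60GB total, 5GB free, $0.15/GB and the $3 fee, that’s $11.25 a month, or $135 a year – about 2.45 times Carbonite’s $55.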

Overall, my backup strategy consists of the following now:

  • Backups every 5 minutes to Amazon S3 with Jungle Disk
  • Automated weekly full backups of system and data drives using Macrium Reflect to a separate physical internal drive
  • Automated nightly incremental backups of system and data drives using Macrium Reflect to a separate physical internal drive
  • Both the weekly full and nightly incremental backup images are also copied to a separate external USB drive.
  • Non-automated backup of utilities drive (this hardly changes and is just to ease restoring if another drive failure happens)
  • Code backed up to offsite SVN repository

This seems like it should cover everything, from a full-blown drive failure to accidentally deleting some important data.