One of the challenges of managing builds and deployments across multiple developer machines is containing and managing the variations between developer environments.
For example, it's often useful to capture settings such as the local server name or connection string so that developer-specific values don't make it into source control.
How I've often tackled this is to add a batch file that every developer executes when getting the source for the first time. This batch file asks for various settings and then saves the results to a text file which is then read back when executing other batch files.
Here is an example of such a batch file:
@ECHO OFF
ECHO ================================================================
ECHO Creates the developer specific local configuration for builds
ECHO that will allow developers to use local settings for specific
ECHO aspects of the deployment.
ECHO ================================================================

:SPURL
SET SPURL=
SET /P SPURL=Enter SharePoint site URL (ex. http://mymachine:2345/): %=%
IF "%SPURL%"=="" GOTO SPURL

ECHO ----------------------------------------------------------------

:SQLCONN
SET SQLCONN=
ECHO Enter SQL connection string to the IC membership database
ECHO (ex. Server=VM1;Database=IPDB;User ID=membershipUser;Password=P@ssw0rd;Application Name=HeCoreServices):
SET /P SQLCONN=
IF "%SQLCONN%"=="" GOTO SQLCONN

ECHO ----------------------------------------------------------------

REM Redirection comes first so no trailing space is written after the value.
>build-configuration.txt ECHO SPURL=%SPURL%
>>build-configuration.txt ECHO SQLCONN=%SQLCONN%

ECHO Completed; created file build-configuration.txt
PAUSE
This batch file will prompt the developer for two settings: the URL of a site and the connection string that the developer is using locally (which can vary by the database name, login, etc.). The contents get written to a file called build-configuration.txt that looks like this:
SPURL=http://mymachine:2345
SQLCONN=Server=VM1;Database=IPDB;User ID=membershipUser;Password=P@ssw0rd;Application Name=HeCoreServices
This file is excluded from source control and developers can, of course, manually edit this file as well to create local settings.
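How you exclude the file depends on your source control system; with Git, for example, it would be a one-line ignore entry (shown here purely as an illustration — the same idea applies to other systems' ignore mechanisms):

```
# .gitignore — keep developer-local build settings out of the repository
build-configuration.txt
```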
Now, when I'm ready to use these settings in another batch file, I can read them back in like so:
@ECHO OFF
SETLOCAL ENABLEDELAYEDEXPANSION

FOR /F "tokens=*" %%n IN (build-configuration.txt) DO (
    ECHO %%n
    SET %%n
)

ECHO %SPURL%
ECHO %SQLCONN%
PAUSE
There are other ways to do this as well, but the downside to most approaches is that you either have to know in advance how many parameters you have or settle for less meaningful names (positional arguments like %1 and %2). This approach lets you set variables to your heart's content and read them in dynamically at execution time.
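The batch syntax above is Windows-specific; as a point of comparison, the same pattern — reading NAME=VALUE pairs dynamically without knowing the names in advance — can be sketched in bash for Unix-like environments. This is an illustrative sketch, not part of the original build; the file name and sample values are borrowed from the example above:

```shell
# Create a sample build-configuration.txt with the values from the example.
cat > build-configuration.txt <<'EOF'
SPURL=http://mymachine:2345
SQLCONN=Server=VM1;Database=IPDB;User ID=membershipUser;Password=P@ssw0rd;Application Name=HeCoreServices
EOF

# Export every non-empty NAME=VALUE line as an environment variable.
while IFS= read -r line; do
  if [ -n "$line" ]; then
    export "$line"
  fi
done < build-configuration.txt

echo "$SPURL"
echo "$SQLCONN"
```

As in the batch version, the variable names come from the file itself, so adding a new setting requires no changes to the reading script.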
There are many ways to improve the performance of your CAML queries, but I've recently found that in some cases, it's as easy as switching the order of your filter operations.
In this case, I was searching across a list of 1,000,000 items for a set of just 41 matching items.
The list consists of tasks with, among other fields, a Status and Assigned To field.
Both of these fields were indexed, but the following query was still running in the 10-second range:
<Where>
  <And>
    <And>
      <Neq>
        <FieldRef Name='Status' />
        <Value Type='Choice'>Completed</Value>
      </Neq>
      <Neq>
        <FieldRef Name='Status' />
        <Value Type='Choice'>Terminated</Value>
      </Neq>
    </And>
    <Eq>
      <FieldRef Name='AssignedTo' LookupId='TRUE' />
      <Value Type='Int'>159</Value>
    </Eq>
  </And>
</Where>
One small tweak and the same query ran in 1.5s:
<Where>
  <And>
    <Eq>
      <FieldRef Name='AssignedTo' LookupId='TRUE' />
      <Value Type='Int'>159</Value>
    </Eq>
    <And>
      <Neq>
        <FieldRef Name='Status' />
        <Value Type='Choice'>Completed</Value>
      </Neq>
      <Neq>
        <FieldRef Name='Status' />
        <Value Type='Choice'>Terminated</Value>
      </Neq>
    </And>
  </And>
</Where>
All I did was shift the order of the query conditions.
The first query reads as "All tasks that are not Completed and not Terminated and Assigned To user 159".
The second query reads as "All tasks that are Assigned To user 159 that are not Completed and not Terminated".
I didn't trace the generated SQL, but it's not hard to imagine that SQL Server now performs an initial filter against the indexed user ID and returns a much smaller data set for the subsequent operations (the filters on Status).
So the lesson learned: for large lists, follow Microsoft's guidance on large lists, but also make sure your queries are written to take advantage of your indexes and reduce the data set as early as possible (preferably by filtering first on an indexed field).