How to import bulk data into Azure SQL

Background

It's a common scenario for a SQL Server based application to import/export data from various sources. With the SQL Server Import/Export Wizard it is so easy that it has never been a concern, but when you try to do the same in an Azure SQL environment, that's a different story.

Problem

Azure SQL has some limitations when it comes to bulk data: it does not offer a built-in bulk import facility yet. So when I needed to import 2 million+ records into my SQL database in Azure, I was in trouble.

A few common options were:

  1. Generate a data script from the local SQL Server and run it in Azure.
  2. Run a bulk insert query from the local SQL Server directly against Azure SQL.

Both options failed miserably, as SQL Server started throwing out-of-memory exceptions due to the volume of data. One workaround is to split the data into small chunks, insert them into different temporary tables, and then generate data scripts from those tables. But I don't like that kind of workaround anyway, because of the effort involved and the lack of re-usability.

Solution

The solution was rather easy: use a command-line tool called the bcp utility. The bcp client is installed when you install the Microsoft SQL Server tools.

My requirement was to export 2 million+ rows from a local SQL Server, where the database name is Enrollment and the table name is Enrollments:

             Local SQL Server   Azure SQL
Database     Enrollment         azuredb
Table Name   Enrollments        Enrollments
DB Username  N/A                username
DB Password  N/A                password

From a command prompt, enter the following command:

BCP <databasename>.dbo.<tablename> OUT <localfolder>\<filename>.txt -T -c

According to my example:

BCP Enrollment.dbo.Enrollments OUT E:\BulkInsert.txt -T -c

You will see a text file generated with all the data from the targeted table. Now it's time to import it into Azure SQL:

BCP <databasename>.dbo.<tablename> IN <localfolder>\<filename>.txt -S <servername> -U <username>@<servername> -P <password> -c

According to my example:

BCP azuredb.dbo.Enrollments IN E:\BulkInsert.txt -S myazure.database.windows.net -U username@myazure.database.windows.net -P password -c

The entire process takes only a few minutes; with the simple bcp utility, import/export becomes a life saver for bulk data manipulation. And with a batch file (or a small wrapper, sketched below) this can be turned into a reusable step too.
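
The batch-file route is simply the two bcp commands above saved into a .bat file. As an alternative sketch of my own (not what the post itself did), the same calls can be wrapped in a small C# helper via Process.Start; bcp must be on the PATH, and the server, database, and credential values are the placeholders from the examples above:

using System.Diagnostics;

public static class BcpRunner
{
    // Launches bcp with the given arguments and blocks until it exits.
    public static void Run(string arguments)
    {
        using (var process = Process.Start(new ProcessStartInfo
        {
            FileName = "bcp",
            Arguments = arguments,
            UseShellExecute = false
        }))
        {
            process.WaitForExit();
        }
    }
}

// Usage, mirroring the export and import commands above:
// BcpRunner.Run(@"Enrollment.dbo.Enrollments OUT E:\BulkInsert.txt -T -c");
// BcpRunner.Run(@"azuredb.dbo.Enrollments IN E:\BulkInsert.txt -S myazure.database.windows.net -U username@myazure.database.windows.net -P password -c");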

MVC Application In Azure Common Vulnerability Resolution

When working with enterprise-grade ASP.NET applications, maintaining security is a key goal. In today's development world it's not only about conforming to standards or practices; you also need to conform to compliance requirements, which ultimately becomes a key selling point for the product.

In this particular post I will point out solutions to a few vulnerabilities flagged by a McAfee Secure compliance test, but if you take a close look they have nothing to do with a specific vendor's compliance; they are must-do items for any ASP.NET web application.

ASP.NET DEBUG Method Enabled Security Issue Vulnerability

Threat

ASP.NET debugging is enabled on the host. An attacker can send debug statements to the remote ASP scripts.

Solution

Disable debugging. In the web.config of the target web application, set the debug attribute to false:

<compilation debug="false"/>

Web Server Internal IP Address/Internal Network Name Disclosure Vulnerability

Threat

ASP.NET applications by default send information about the site with each response. A target host using HTTP may also be vulnerable to this issue.

Solution

In web.config

<system.web>
  <httpRuntime enableVersionHeader="false" />
</system.web>

In Global.asax.cs (inside Application_Start):

MvcHandler.DisableMvcResponseHeader = true;
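
The settings above remove the X-AspNet-Version and X-AspNetMvc-Version headers. As a complementary sketch (an assumption on my part, not something shown in the original post), the Server header can be stripped in Global.asax.cs as well; this requires the IIS integrated pipeline:

// Global.asax.cs: strip identifying headers before each response is sent.
// Which headers to remove is an assumption based on common hardening advice.
protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
{
    Response.Headers.Remove("Server");        // e.g. "Microsoft-IIS/7.5"
    Response.Headers.Remove("X-Powered-By");  // set by IIS by default
}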

Remote Management Service Accepting Unencrypted Credentials Detected

Threat

A remote management service (Telnet, FTP) accepts unencrypted credentials. A malicious user/bot can easily intercept unencrypted passwords during transmission and gain unauthorized access.

Solution

Use alternative services that provide encryption. The most common case is the FTP service within IIS; the solution to this problem is to use SFTP instead of plain FTP.

Tools To Check Vulnerability Issues

ASafaWeb Security Analyser: analyses the security configuration of ASP.NET websites.

A free online service to test the configuration of any SSL web server.

ASP.NET MVC Performance Tuning Guidelines

In today's world, every application is expected to perform faster and smoother; it's part of the overall user experience (UX). With the advancement of technology and infrastructure, it is widely expected that an application will not only do the job perfectly but also do it within a reasonable time. No one wants to wait 5 minutes to load a page or complete a job while using a system. In this phase of UX evolution, performance tuning is a common software development practice.

Simply described, performance tuning means identifying the issues that slow down the application and applying a few tricks to improve performance.
In this particular post, I will only focus on tuning web applications. Tuning any web application involves 3 steps:

  1. Insight
  2. Tune client end
  3. Tune server end

Insight

Before making your move, you first need to know what the problem with your application is: where you need to optimize, and what your priorities are. To get these insights you need tools. Some tools provide insight into the client end; the rest cover server-end performance. Selecting the right set of tools and using them is the first step of performance tuning.

Tune client end

Speeding up the response time is crucial for UX and impacts user satisfaction. A few tips/tricks can help you speed up your web application's response time. Most of this time is tied up in downloading all the components of the page; minimizing it is the key to a faster response.

Tune server end

Server-side tuning mainly focuses on low latency: how we can speed up the work the application is doing at the server end. This may involve database calls, file operations, or external API calls, or maybe some complex but poorly constructed business logic is the culprit. The newer ASP.NET frameworks (4.5/5) ship (or will ship) with lots of performance-improvement techniques and features; the question is how efficiently we can utilize them.

Tools To Get Insight

Glimpse: for client-end insight.

Visual Studio Profiling: a Visual Studio tool for performance diagnosis.

Client Side Tuning

  • Use cookies for frequently accessed non-sensitive information
  • Use the HTML5 async attribute for script files, to enable asynchronous loading:

Scripts.RenderFormat(@"<script src='{0}' async></script>", "~/bundles/jquery")

  • Use a Content Delivery Network (CDN)
  • Minify JavaScript and CSS
  • Avoid CSS expressions
  • Remove duplicate scripts
  • Make Ajax calls cacheable (see the sketch after this list)
  • Choose <link> over @import
  • Optimize images
  • Don’t scale images in HTML
  • Don’t put empty image src
  • Make favicon.ico small and cache-able
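
For the Ajax-caching item above, one option is ASP.NET MVC's OutputCache attribute with a client cache location, so the browser can reuse the response of repeated Ajax GETs. A minimal sketch; the controller, action, and payload below are hypothetical:

using System.Web.Mvc;
using System.Web.UI;

public class ProductsController : Controller
{
    // Let the browser cache this JSON response for 60 seconds per id.
    [OutputCache(Duration = 60, VaryByParam = "id", Location = OutputCacheLocation.Client)]
    public JsonResult Details(int id)
    {
        var product = new { Id = id, Name = "Sample" }; // hypothetical payload
        return Json(product, JsonRequestBehavior.AllowGet);
    }
}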

Server Side Tuning

  • Run your site in Release mode, not Debug mode
  • Reduce the number of requests to the server by bundling
  • Use Base64 data URIs for small images to save extra requests
  • Avoid passing null models to views
  • Do not use Session or TempData (TempData uses session state under the hood)
  • Add gzip (HTTP compression) and static caching (images, CSS, …) in your web.config:

<system.webServer>
  <urlCompression doDynamicCompression="true" doStaticCompression="true" dynamicCompressionBeforeCache="true"/>
</system.webServer>

  • Tell the web server to set the "Expires" header to a far-future date, effectively telling the browser to cache static content forever:
<staticContent>
 <clientCache httpExpires="Sun, 29 Mar 2020 00:00:00 GMT" cacheControlMode="UseExpires" />
</staticContent>
  • If you use Razor, add the following code to your Global.asax.cs. By default, ASP.NET MVC registers both an ASPX engine and a Razor engine; this keeps only the RazorViewEngine:

ViewEngines.Engines.Clear();
ViewEngines.Engines.Add(new RazorViewEngine());

  • Replace foreach/for loops with LINQ wherever possible.
  • Avoid multiple database calls to load related data for a single request. Say you make 2 separate DB calls to get a page of users and the total number of users for a request that does paging; merge those two calls into one.
  • Use an in-memory cache for non-cloud applications and a distributed cache for cloud ones (see the sketch after this list).
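
As a minimal sketch of the in-memory option, using System.Runtime.Caching from the .NET Framework; the cache key and the loader delegate here are hypothetical:

using System;
using System.Runtime.Caching;

public static class UserStatsCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns the cached total when present; otherwise calls loadFromDb()
    // once and keeps the result for five minutes.
    public static int GetTotalUsers(Func<int> loadFromDb)
    {
        object cached = Cache.Get("TotalUsers");
        if (cached != null)
            return (int)cached;

        int total = loadFromDb();
        Cache.Set("TotalUsers", total, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(5)
        });
        return total;
    }
}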

Optimizing or improving performance is a continuous effort. Monitor the application regularly, and think about user behavior/interaction and the responses to gain the maximum result. And as a note: load testing is an excellent way to find out how your application behaves before your clients find out.

MongoDB Tips & Tricks

Commands

Update a specific field value for a collection:

Example: Update “CampaignId” field across entire collection

db.Prospects.update(
{}, {$set: {CampaignId: ObjectId('54ec891dc8efe23e3e0fb1ef')}}, { multi: true }
)
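
The same multi-document update can be issued from the C# driver; a sketch under the assumption that the collection is accessed as BsonDocument through a _database handle like the one used in the "IN" example later in this post:

using MongoDB.Bson;
using MongoDB.Driver;

var prospects = _database.GetCollection<BsonDocument>("Prospects");
// Empty filter + UpdateMany is the driver equivalent of update({}, ..., { multi: true }).
// The same pattern covers the $unset and $rename examples below, via
// Builders<BsonDocument>.Update.Unset("CampaignId") and .Rename("CampaignId", "CampaignName").
var setCampaign = Builders<BsonDocument>.Update
    .Set("CampaignId", new ObjectId("54ec891dc8efe23e3e0fb1ef"));
await prospects.UpdateManyAsync(new BsonDocument(), setCampaign);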

Remove a specific field from a collection:

Example: Remove the “CampaignId” field across the entire collection

db.Prospects.update(
   {},
   { $unset: { CampaignId: "" } },
   { multi: true }
)

Filter documents in a collection based on a list of Ids, equivalent to SQL “IN”.
Example: Retrieve documents matching a list of “ProductId”s across the entire collection.

var ids = new List<ObjectId>();
for (int index = 0; index < Products.Count; index++)
{
    ids.Add(ObjectId.Parse(Products[index].Id));
}
// Filter.In matches any document whose ProductId is in the list (SQL "IN")
var filter = Builders<BsonDocument>.Filter.In("ProductId", ids);
return await _database.GetCollection<BsonDocument>("AnalyticsSummary")
    .Find(filter)
    .ToListAsync()
    .ConfigureAwait(false);

Rename a specific field for a collection:
Example: Rename the “CampaignId” field to “CampaignName” across the entire collection

db.getCollection('CampaignSummary').update({},{ $rename: { "CampaignId": "CampaignName" } }, false, true )

The false, true in the call above correspond to { upsert: false, multi: true }. You need multi: true to update all your records.

MongoDB Force a Member to Become Primary in a ReplicaSet

Problem Background

The primary is the only member in the replica set that receives write operations. If for some reason the primary becomes unavailable, an election determines the new primary, which on some occasions is not what you expect. An election can happen due to a hardware issue, or maybe someone accidentally shut down the primary, leading to this unwanted selection. On another note, you may simply want one of your secondaries to act as primary. You can achieve this forcefully, and here is how you do it.

Solution To Problem

Say we have a 3-node replica set on 3 physically separated machines:

192.168.1.0 – the current primary.
192.168.1.1 – a secondary.
192.168.1.2 – a secondary.

Our goal is to make 192.168.1.2 the primary.

From the mongo shell, connect to the current primary and ask it to step down:

mongo 192.168.1.0:27017
rs.stepDown(120)

Now connect to the other secondary (192.168.1.1) and freeze it so it does not seek election:

mongo 192.168.1.1:27017
rs.freeze(120)

Wait for 120 seconds, then connect to 192.168.1.2 and check the replica set status:

mongo 192.168.1.2:27017
rs.status()

You will see that the machine with IP 192.168.1.2 has now become the primary.

Solve MongoDB “Element does not match any field or property of class”

Introduction

Due to the schema-less nature of MongoDB collections, documents in one collection may contain different numbers of fields. It's not recommended, but it can happen due to business requirements. So for example, say we have a collection “Client” containing 2 documents, where the second document lacks one field, “City”. While binding back to a POCO via the C# driver, we get the exception “Element does not match any field or property of class”.

Solution

When a BSON document is deserialized back to a POCO, the name of each element is used to look up a matching field or property in the class map; when the deserializer does not find a matching property, it throws the exception we have in this scenario. If we want to ignore those extra elements, there are two possible ways: during deserialization, or during initialization of the BsonClassMap. I prefer doing it while I do the mapping, as I like to keep my POCO clean.

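Here is a minimal sketch of such a mapping; the exact shape of the Client class is an assumption based on the example above:

using MongoDB.Bson;
using MongoDB.Bson.Serialization;

public class Client
{
    public ObjectId Id { get; set; }
    public string Name { get; set; }
    // No City property here: a document carrying a City element would
    // normally throw "Element does not match any field or property of class".
}

public static class MongoMappings
{
    // Call once at startup (e.g. from Application_Start).
    public static void Register()
    {
        BsonClassMap.RegisterClassMap<Client>(map =>
        {
            map.AutoMap();
            map.SetIgnoreExtraElements(true);
        });
    }
}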

As you can see in the code block above, the SetIgnoreExtraElements method is used inside the mapping to ignore extra elements from the BSON document.

map.SetIgnoreExtraElements(true);

That’s it, now you can work with your POCO more easily.

How to solve the “The FastCGI process exited unexpectedly” error

Background

IIS with PHP is not new nowadays; there are lots of business needs that require PHP applications to be hosted in IIS. In recent times PHP has improved a lot in terms of integration with IIS. Still, sometimes a few things fall apart, like the problem we are going to illustrate.

Problem

For a recent project I had to configure PHP with IIS 7.0 on Windows 2008 R2. Everything seemed fine: I added an application pool with “No Managed Code”, configured the handler mapping, and sat tight waiting to see PHP running.

As soon as I hit http://localhost/phpinfo.php, boom, I saw the error message “The FastCGI process exited unexpectedly”.

Solution

After lots of googling I found the solution at last, and it was super easy: you have to install the Visual C++ Redistributable for Visual Studio. It is specifically mentioned in the PHP download section, but I somehow skipped that and wasted hours finding a solution.

The problem was that PHP as FastCGI with IIS requires you to install the Non-Thread Safe (NTS) version of PHP, and recent release binaries are built with the Visual Studio 2012 compiler. So to execute PHP scripts you need to have the Visual C++ Redistributable for Visual Studio 2012 installed on the server. In addition to the standard installation steps, you have to add an application pool with “No Managed Code” and Managed Pipeline Mode set to “Classic”. Then go to the “Advanced Settings” of the newly created application pool and set “Enable 32-Bit Applications” to “True”.