Sunday, October 3, 2010

Test Post

A quick test post. The snippet below is a log4net AdoNetAppender <parameter> element that maps the @log_date command parameter to the logging event's timestamp via RawTimeStampLayout:

<parameter>
        <parameterName value="@log_date"/>
        <dbType value="DateTime"/>
        <layout type="log4net.Layout.RawTimeStampLayout"/>
</parameter>
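For context, this element typically sits inside a log4net AdoNetAppender configuration, where it binds the @log_date parameter of the INSERT command to the event timestamp. A minimal sketch (the connection string, table name, and surrounding columns are placeholders, not from the original post):

```xml
<appender name="AdoNetAppender" type="log4net.Appender.AdoNetAppender">
  <connectionType value="System.Data.SqlClient.SqlConnection, System.Data" />
  <!-- placeholder connection string -->
  <connectionString value="..." />
  <!-- placeholder table/column; only @log_date is from the original snippet -->
  <commandText value="INSERT INTO Log ([Date]) VALUES (@log_date)" />
  <parameter>
    <parameterName value="@log_date" />
    <dbType value="DateTime" />
    <layout type="log4net.Layout.RawTimeStampLayout" />
  </parameter>
</appender>
```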

Tuesday, September 14, 2010

Copy Files from FTP Server to Local or any other File System Server

Have you ever worked with files on an FTP server and needed to copy them to a specific drive on a server before performing some operations on them?
If not, and you now want to do this or keep it handy for later, this post is for you.
I would recommend adding it to your favorites for future reference.

For simplicity, I will use the local computer as the FTP server.
Step 1: Create an FTP folder. Go to My Computer, browse to C:\inetpub\ftproot\, and create a folder that will be treated as the FTP folder. Note: IIS must be installed.
Step 2: Save any text file, say basefile.txt, containing some text.
Step 3: Open Visual Studio 2005/2008/2010 and add a Console Application project.
Step 4: Add a .cs file and rename it FTPFileTransfer.cs.
Step 5: Add the following code.


        // Requires: using System; using System.IO; using System.Net; using System.Text;
        public class FTPFileTransfer
        {
            public FtpStatusCode Download(string destinationFile, Uri downloadUri, string userName, string password)
            {
                // Check that the URI is an FTP site
                if (downloadUri.Scheme != Uri.UriSchemeFtp)
                {
                    throw new ArgumentException("Invalid FTP site");
                }

                // Set up the request
                FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create(downloadUri);

                // Credentials for the FTP URI
                ftpRequest.Credentials = new NetworkCredential(userName, password);

                // Set the method to file download
                ftpRequest.Method = WebRequestMethods.Ftp.DownloadFile;

                // Get the file as a stream from the response and write it
                // to the local disk. The using blocks close the response
                // and streams even if an exception is thrown.
                using (FtpWebResponse ftpResponse = (FtpWebResponse)ftpRequest.GetResponse())
                using (Stream stream = ftpResponse.GetResponseStream())
                using (StreamReader reader = new StreamReader(stream, Encoding.UTF8))
                using (StreamWriter writer = new StreamWriter(destinationFile, false))
                {
                    writer.Write(reader.ReadToEnd());
                    return ftpResponse.StatusCode;
                }
            }
        }


Now you can create an object of this class and call the Download method.
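For example, a minimal console harness could look like this (the host, credentials, and file paths are placeholders, not values from the post):

```csharp
// Hypothetical usage of the FTPFileTransfer class above.
// Requires: using System; using System.Net;
class Program
{
    static void Main()
    {
        var ftp = new FTPFileTransfer();

        // Placeholder server and file names; adjust to your FTP setup.
        Uri source = new Uri("ftp://localhost/basefile.txt");
        FtpStatusCode status = ftp.Download(@"C:\temp\basefile.txt", source, "user", "password");

        Console.WriteLine("Download finished with status: " + status);
    }
}
```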

Please share your comments on the article.

Wednesday, September 8, 2010

Few Things that we Know But we fail to Implement....

These thoughts have already been written in many places and heard from many of us, but I am re-writing them just to remind myself and everyone who reads this post. Pause for a moment, think, and try to implement as many of them as you can, and life may become happier and easier to live.




Anger is a condition in which
The tongue works faster than the mind .

*********

Harsh words break no bones
but they do break hearts.

*********

For every minute you are angry with someone,
you lose 60 seconds of happiness
that you can never get back.

*********

You can't change the past,
but you can ruin the present
By worrying over the future

*********

If you fill your heart with regrets
of yesterday and the worries of tomorrow,
you have no today to be thankful for.

*********

To get out of a difficulty,
one usually must go through it.

*********

The choice you make today
will usually affect tomorrow.

*********

All people smile in the same language.

*********

Laughter is God's sunshine

*********

Take time to laugh, for it is
the music of the soul.

*********

Love is strengthened by working
through conflicts together.

*********

Love is the only thing that can be
divided without being diminished.

*********

Everyone has beauty
but not everyone sees it.

*********

Man looks at outward appearance
but the Lord looks within.

*********

We take for granted the things
that we should be giving thanks for.

*********

Happiness is enhanced by others
but does not depend upon others.

*********

Thank God for what you have,
TRUST GOD for what you need.

*********

Do what you can, for who you can,
with what you have, and where you are.

*********

The real measure of a man's wealth
is what he has invested in eternity.

Wednesday, September 1, 2010

Understanding the Components of an Integration Services Package

In the last post you read about the SSIS project components and folders, which include SSIS package files. An SSIS package is a collection of managed control flow, data flow, and event handlers that can run independently and can be scheduled as a job to carry out tasks.
In this post I will explain the components of an Integration Services package in detail; each component will then be dealt with separately in its own post.
Let's recap what we discussed about SSIS packages in the earlier post.

An SSIS package has the following major components:
  1. Control Flow
  2. Data Flow
  3. Event Handlers
  4. Connections
Let's discuss each of them in detail:

Control Flow: SQL Server Integration Services provides three different types of control flow elements: containers that provide structures in packages, tasks that provide functionality, and precedence constraints that connect the executables, containers, and tasks into an ordered control flow.



Hence we can summarise that the Control Flow component of SSIS has the following elements:
  • Containers
  • Tasks
  • Precedence constraints
Now it's time to learn more about each element of the control flow.

1. Container: Containers provide structure in packages and services to tasks in the control flow. Integration Services includes the following container types, for grouping tasks and implementing repeating control flows:
  • The Foreach Loop container enumerates a collection and repeats its control flow for each member of the collection.
  • The For Loop container repeats its control flow until a specified expression evaluates to False.
  • The Sequence container lets you define a subset of the control flow within a container and manage tasks and containers as a unit.
2. Task: Tasks do the work in packages. Integration Services includes tasks for performing a variety of functions.
  • The Data Flow task defines and runs data flows that extract data, apply transformations, and load data.
  • Data preparation tasks copy files and directories, download files and data, save data returned by Web methods, or work with XML documents.
  • Workflow tasks communicate with other processes to run packages or programs, send and receive messages between packages, send e-mail messages, read Windows Management Instrumentation (WMI) data, or watch for WMI events.
  • SQL Server tasks access, copy, insert, delete, or modify SQL Server objects and data.
  • Analysis Services tasks create, modify, delete, or process Analysis Services objects.
  • Scripting tasks extend package functionality through custom scripts.
  • Maintenance tasks perform administrative functions, such as backing up and shrinking SQL Server databases, rebuilding and reorganizing indexes, and running SQL Server Agent jobs.

3. Precedence Constraints: Precedence constraints connect containers and tasks in packages into an ordered control flow. You can control the sequence of execution for tasks and containers, and specify conditions that determine whether tasks and containers run.
 
 
Now it's time to learn about the Data Flow component of SSIS.
 
Data Flow: SQL Server Integration Services provides three different types of data flow components: sources, transformations, and destinations. Sources extract data from data stores such as tables and views in relational databases, files, and Analysis Services databases. Transformations modify, summarize, and clean data. Destinations load data into data stores or create in-memory datasets.
 
 
Hence we can summarise that the Data Flow component has the following major elements:
  • Source
  • Transformation
  • Destination
  • Input
  • Output
  • Error Output
Let's see each element in detail:

1. Source:
                In Integration Services, a source is the data flow component that makes data from different external data sources available to the other components in the data flow.  
  • The source for a data flow typically has one regular output. The regular output contains output columns, which are columns the source adds to the data flow.
  • The regular output references external columns. An external column is a column in the source. For example, the MakeFlag column in the Product table of the AdventureWorks database is an external column that can be added to the regular output. Metadata for external columns includes such information as the name, data type, and length of the source column.
  • An error output for a source contains the same columns as the regular output, and also contains two additional columns that provide information about errors. The Integration Services object model does not restrict the number of regular outputs and error outputs that sources can have. Most of the sources that Integration Services includes, except the Script component, have one regular output, and many of the sources have one error output. Custom sources can be coded to implement multiple regular outputs and error outputs.
2. Transformation:
                               The capabilities of transformations vary broadly. Transformations can perform tasks such as updating, summarizing, cleaning, merging, and distributing data.

  • The inputs and outputs of a transformation define the columns of incoming and outgoing data. Depending on the operation performed on the data, some transformations have a single input and multiple outputs, while other transformations have multiple inputs and a single output. Transformations can also include error outputs, which provide information about the error that occurred, together with the data that failed: for example, string data that could not be converted to an integer data type. The Integration Services object model does not restrict the number of inputs, regular outputs, and error outputs that transformations can contain. You can create custom transformations that implement any combination of multiple inputs, regular outputs, and error outputs.

3. Destinations:
                         A destination is the data flow component that writes the data from a data flow to a specific data store, or creates an in-memory dataset.
  • An Integration Services destination must have at least one input. The input contains input columns, which come from another data flow component. The input columns are mapped to columns in the destination.
  • Many destinations also have one error output. The error output for a destination contains output columns, which typically contain information about errors that occur when writing data to the destination data store. Errors occur for many different reasons. For example, a column may contain a null value, whereas the destination column cannot be set to null.

 

4. Inputs

              Destinations and transformations have inputs. An input contains one or more input columns, which can refer to external columns if the data flow component has been configured to use them. Inputs can be configured to monitor and control the flow of data: for example, you can specify if the component should fail in response to an error, ignore errors, or redirect error rows to the error output. You can also assign a description to the input or update the input name. In SSIS Designer, inputs are configured by using the Advanced Editor dialog box. For more information about the Advanced Editor, see Integration Services User Interface.

 
5. Outputs:
                  Sources and transformations always have outputs. An output contains one or more output columns, which can refer to external columns if the data flow component has been configured to use them. Outputs can be configured to provide information useful to downstream processing of the data. For example, you can indicate whether the output is sorted. You can also provide a description for the output, or update the output name. In SSIS Designer, outputs are configured by using the Advanced Editor dialog box.

 
6. Error Outputs
                          Sources, destinations, and transformations can include error outputs. You can specify how the data flow component responds to errors in each input or column by using the Configure Error Output dialog box. If an error or data truncation occurs at run time and the data flow component is configured to redirect rows, the data rows with the error are sent to the error output. By default, an error output contains the output columns and two error columns: ErrorCode and ErrorColumn. The output columns contain the data from the row that failed, ErrorCode provides the error code, and ErrorColumn identifies the failing column.

 
 
Connections
                     Microsoft SQL Server Integration Services packages use connections to perform different tasks and to implement Integration Services features:

  • Connecting to source and destination data stores such as text, XML, Excel workbooks, and relational databases to extract and load data.
  • Connecting to relational databases that contain reference data to perform exact or fuzzy lookups.
  • Connecting to relational databases to run SQL statements such as SELECT, DELETE, and INSERT commands and also stored procedures.
  • Connecting to SQL Server to perform maintenance and transfer tasks such as backing up databases and transferring logins.
  • Writing log entries in text and XML files and SQL Server tables and package configurations to SQL Server tables.
  • Connecting to SQL Server to create temporary work tables that some transformations require to do their work.
  • Connecting to Analysis Services projects and databases to access data mining models, process cubes and dimensions, and run DDL code.
  • Specifying existing or creating new files and folders to use with Foreach Loop enumerators and tasks.
  • Connecting to message queues and to Windows Management Instrumentation (WMI), SQL Server Management Objects (SMO), Web, and mail servers.

 
 
In this section, we will see the types of connections we can use in an SSIS package:
  1. Connection Managers : Describes how connection managers work.
  2. Analysis Services Connection Manager: Describes the connection manager for connecting to an instance of Analysis Services or an Analysis Services project.
  3. ADO Connection Manager: Describes the connection manager for connecting to relational data sources by using ADO.
  4. ADO.NET Connection Manager: Describes the connection manager for connecting to relational data sources by using ADO.NET.
  5. Cache Connection Manager: Describes the connection manager for reading data from the data flow or from a cache file (.caw), and saving data to a cache file.
  6. Excel Connection Manager: Describes the connection manager for connecting to Excel workbooks.
  7. File Connection Manager: Describes the connection manager for connecting to a single file or folder.
  8. Flat File Connection Manager: Describes the connection manager for accessing data in a single flat file.
  9. FTP Connection Manager: Describes the connection manager for connecting to an FTP server.
  10. HTTP Connection Manager: Describes the connection manager for connecting to a Web service or Web site.
  11. MSMQ Connection Manager: Describes the connection manager for connecting to a Message Queuing (also known as MSMQ) message queue.
  12. Multiple Files Connection Manager: Describes the connection manager for connecting to multiple files and folders.
  13. Multiple Flat Files Connection Manager: Describes the connection manager for accessing data in multiple flat files.
  14. ODBC Connection Manager: Describes the connection manager for connecting to data sources by using ODBC.
  15. OLE DB Connection Manager: Describes the connection manager for connecting to data sources by using OLE DB.
  16. SMO Connection Manager: Describes the connection manager for connecting to SQL Server Management Objects (SMO).
  17. SMTP Connection Manager: Describes the connection manager for connecting to SMTP servers.
  18. SQL Server Compact Edition Connection Manager: Describes the connection manager for connecting to SQL Server Compact databases.
  19. WMI Connection Manager: Describes the connection manager for connecting to a Windows Management Instrumentation (WMI) server and specifying a server namespace.
  20. Data Source (SSIS): Describes the data source object from which you can create a connection manager.
  21. Data Source View (SSIS) : Describes a view that you create based on a data source.

 Last but not least:
 Event Handlers
                         At run time, executables (packages and Foreach Loop, For Loop, Sequence, and task host containers) raise events. For example, an OnError event is raised when an error occurs. You can create custom event handlers for these events to extend package functionality and make packages easier to manage at run time.
 
Event handlers can perform tasks such as the following:
  • Clean up temporary data storage when a package or task finishes running.
  • Retrieve system information to assess resource availability before a package runs.
  • Refresh data in a table when a lookup in a reference table fails.
  • Send an e-mail message when an error or a warning occurs or when a task fails.
If an event has no event handler, the event is raised to the next container up the container hierarchy in a package. If this container has an event handler, the event handler runs in response to the event. If not, the event is raised to the next container up the container hierarchy.
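Event handlers are usually built in the SSIS Designer, but they can also be attached programmatically through the Integration Services runtime API. A rough sketch, assuming a reference to Microsoft.SqlServer.ManagedDTS and an existing package on disk (the path and the choice of Send Mail task are placeholders):

```csharp
// Sketch only: the package path is a placeholder, and the snippet
// assumes a reference to Microsoft.SqlServer.ManagedDTS.dll.
using Microsoft.SqlServer.Dts.Runtime;

class EventHandlerDemo
{
    static void Main()
    {
        Application app = new Application();
        Package package = app.LoadPackage(@"C:\SSIS\MyPackage.dtsx", null);

        // Attach an OnError event handler to the package itself; tasks
        // placed inside this handler run whenever OnError is raised.
        DtsEventHandler onError = (DtsEventHandler)package.EventHandlers.Add("OnError");

        // Add a task to the handler's control flow, e.g. a Send Mail task
        // (the "STOCK:" moniker form is the usual way to add stock tasks).
        Executable notify = onError.Executables.Add("STOCK:SendMailTask");

        app.SaveToXml(@"C:\SSIS\MyPackage.dtsx", package, null);
    }
}
```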

 
We now have a thorough overview of the SSIS package. Get ready for hands-on experience with real-world development.
In the next post I will start with the control flow elements and development. Be ready!

Tuesday, August 31, 2010

Creating a SSIS Solution and Understanding its Contents

As discussed in an earlier post, SSIS (SQL Server Integration Services) is project-driven, hence we need to create a project for it from BIDS (Business Intelligence Development Studio).
To do this, click
START > Programs > Microsoft SQL Server 2008 > SQL Server Business Intelligence Development Studio
Graphically, it looks like this:


Now we have created a project that looks like this:


Understanding the Contents of Integration Services Projects

Now we will discuss the Solution Explorer contents of an SSIS project in detail.
An Integration Services project manages the object definitions of data sources, data source views, and packages. Separate folders within an Integration Services project store the files for data sources, data source views, and packages.



As you can see above there are four main folders:

1. Data Sources
A data source is a connection reference that you create outside a package. A data source represents a simple connection to a data store, which includes all tables and views in the data store.
The advantages of using data sources in Integration Services packages include the following:
  1. A data source has project scope, which means that a data source created in an Integration Services project is available to all the packages in the project. A data source can be defined one time and then referenced by connection managers in multiple packages.
  2. A data source offers synchronization between the data source object and its package references. If the data source and the packages that reference it reside in the same project, the connection string property of the data source references is automatically updated when the data source changes.
To add a data source reference
  1. In Business Intelligence Development Studio, open the Integration Services project that contains the package you want.
  2. In Solution Explorer, double-click the package to open it.
  3. To make the Connection Managers area available, click the Control Flow, the Data Flow, or the Event Handlers tab in SSIS Designer.
  4. Right-click within the Connection Managers area and then click New Connection From Data Source. The Select Data Source dialog box opens, listing the available data source objects.
  5. Click the data source object you want to use.
  6. Click OK.

2. Data Source Views

 A data source view is a document that describes the schema of an underlying data source. A data source view provides a named, browseable, persisted selection of database objects that can be used to define sources, destinations, and lookup tables for SQL Server Integration Services tasks, transformations, data sources, and destinations. In Integration Services, a data source view is a design-time object that makes it easier to implement the same data source in multiple packages.
 
3. SSIS Packages

 A package is an organized collection of connections, control flow elements, data flow elements, event handlers, variables, and configurations, that you assemble using either the graphical design tools that SQL Server Integration Services provides, or build programmatically. You then save the completed package to SQL Server, the SSIS Package Store, or the file system. The package is the unit of work that is retrieved, executed, and saved.
 
When you first create a package, it is an empty object that does nothing. To add functionality to a package, you add a control flow and, optionally, one or more data flows to the package.
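Besides the graphical tools, a package can also be loaded and executed from code. A minimal sketch using the Integration Services runtime API (the package path is a placeholder, and a reference to Microsoft.SqlServer.ManagedDTS is assumed):

```csharp
// Sketch only: assumes a reference to Microsoft.SqlServer.ManagedDTS.dll
// and a package saved to the file system (path is a placeholder).
using System;
using Microsoft.SqlServer.Dts.Runtime;

class RunPackage
{
    static void Main()
    {
        Application app = new Application();
        Package package = app.LoadPackage(@"C:\SSIS\MyPackage.dtsx", null);

        // Execute the package and inspect the result (Success/Failure/...).
        DTSExecResult result = package.Execute();
        Console.WriteLine("Package finished: " + result);
    }
}
```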

The following diagram shows a simple package that contains a control flow with a Data Flow task, which in turn contains a data flow

4. Miscellaneous
Contains files other than source, data source view, or package files.

 

This is sufficient for this class and I will describe in detail about the Packages and its parts like Control Flow, Data Flow and Event Handlers... till then keep in touch and your remarks will highly be appreciated..

 
 

Monday, August 30, 2010

Basics of SSIS (Sql Server Integration Services)

With the release of SQL Server 2005, a new and advanced form of DTS was launched, named SQL Server Integration Services. Basically, SSIS is a complete ETL (Extraction, Transformation, and Loading) tool.
It is very interesting to know that without writing even a single line of code you can perform many complex tasks, such as:
  • Bulk-inserting data from a flat file into a SQL Server database.
  • Email notifications.
  • Manipulation of data.
  • Data transfer from heterogeneous data sources to a SQL Server database.
  • SQL Server administrative tasks, and much more.
There are three main modules in SSIS:
  1. Control Flow
  2. Data Flow
  3. Event Handlers
Each of them plays a vital role in the development of the ETL operations required for your organization.
Other components of SSIS are:
  • Connection objects: mainly for setting up connections to the source and destination databases.
  • Transformation objects: task objects that help you convert the source data from one form to another in the desired format.
  • Source and destination databases/flat files: self-explanatory.
  • Notification tasks: such as the email notification task.
  • FTP task, and much more.
This was just to give you a glimpse of SSIS: a basic introduction. In later posts you will find a detailed analysis of each and every component.

Putting a Message in MSMQ through C#

Before I start with the actual code used to write and read a message, I would like to give a brief overview of MSMQ.

Microsoft Message Queuing or MSMQ is a Message Queue implementation developed by Microsoft and deployed in its Windows Server operating systems since Windows NT 4 and Windows 95. The latest Windows 7 also includes this component. In addition to its mainstream server platform support, MSMQ has been incorporated into Microsoft Embedded platforms since 1999 and the release of Windows CE 3.0.
MSMQ is essentially a messaging protocol that allows applications running on separate servers/processes to communicate in a failsafe manner. A queue is a temporary storage location from which messages can be sent and received reliably, as and when conditions permit. This enables communication across heterogeneous networks and between computers which may not always be connected. By contrast, sockets and other network protocols assume that direct connections always exist.


MSMQ has been available to developers on Microsoft platforms since 1997, and has commonly been used in enterprise software built with Visual Studio, both in the native pre-.NET incarnations (versions 5 and 6) and in Visual Studio .NET. Microsoft has also incorporated MSMQ in its messaging technology framework, Windows Communication Foundation (WCF). Under WCF, MSMQ can be used to provide secure, reliable transport with a unified programming model compatible with other communications standards.

MSMQ is responsible for reliably delivering messages between applications inside and outside the enterprise. MSMQ ensures reliable delivery by placing messages that fail to reach their intended destination in a queue and then resending them once the destination is reachable. It also supports security and priority based messaging. Dead letter queues can be created for looking at messages which timed out or failed for other reasons.

Following is the code:
public class MSMQ_rw
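The original listing is cut off at the class declaration, so as a rough sketch of what such a read/write helper typically looks like with System.Messaging (the queue path, message body, and method names are my assumptions, not the original code):

```csharp
// Sketch only: the original class body is missing, so this is a guess
// at a typical System.Messaging read/write helper.
// Requires a reference to System.Messaging.dll.
using System.Messaging;

public class MSMQ_rw
{
    private readonly string queuePath;

    public MSMQ_rw(string queuePath) // e.g. @".\Private$\testqueue"
    {
        this.queuePath = queuePath;
        // Create the queue if it does not already exist.
        if (!MessageQueue.Exists(queuePath))
            MessageQueue.Create(queuePath);
    }

    public void WriteMessage(string body, string label)
    {
        using (var queue = new MessageQueue(queuePath))
        {
            queue.Send(body, label);
        }
    }

    public string ReadMessage()
    {
        using (var queue = new MessageQueue(queuePath))
        {
            // The formatter must be set before reading the message body.
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            Message message = queue.Receive(); // blocks until a message arrives
            return (string)message.Body;
        }
    }
}
```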