Tuesday, September 14, 2010

Copy Files from FTP Server to Local or any other File System Server

Have you ever needed to pull files from an FTP server, copy them to a specific drive on a server, and then perform some operations on them?
If not, and you now want to do the same (or want to keep the technique handy for the future), then this post is for you.
I would recommend adding it to your favorites for future reference.

For simplicity, I will use the local computer as the FTP server.
Step 1: Create an FTP folder. Go to My Computer, then C:\inetpub\ftproot\, and create a folder that will be treated as the FTP folder. Note: IIS should be installed.
Step 2: Save any text file, let's say basefile.txt, containing some text.
Step 3: Open Visual Studio 2005/2008/2010 and add a Console Application project.
Step 4: Add a .cs file and rename it FTPFileTransfer.cs.
Step 5: Add the following code.


        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        public class FTPFileTransfer
        {
            public FtpStatusCode Download(string destinationFile, Uri downloadUri, string userName, string password)
            {
                // Check that the URI is an FTP site
                if (downloadUri.Scheme != Uri.UriSchemeFtp)
                {
                    throw new ArgumentException("Invalid FTP site");
                }

                // Set up the request
                FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create(downloadUri);

                // Credentials for the FTP URI
                ftpRequest.Credentials = new NetworkCredential(userName, password);

                // Set the method to file download
                ftpRequest.Method = WebRequestMethods.Ftp.DownloadFile;

                // Get the response object
                using (FtpWebResponse ftpResponse = (FtpWebResponse)ftpRequest.GetResponse())
                // Get the file as a stream from the response object
                // and write it as a file stream to the local disk;
                // the using blocks close the streams even if an exception is thrown
                using (Stream stream = ftpResponse.GetResponseStream())
                using (StreamReader reader = new StreamReader(stream, Encoding.UTF8))
                using (StreamWriter writer = new StreamWriter(destinationFile, false))
                {
                    writer.Write(reader.ReadToEnd());
                    return ftpResponse.StatusCode;
                }
            }
        }


Now you can create an object of this class and call the Download method.
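
For example, a minimal console entry point could look like the sketch below. The FTP URI, destination path, and credentials are placeholders; replace them with values for your own FTP folder and account.

        using System;
        using System.Net;

        class Program
        {
            static void Main()
            {
                // Placeholder values: point these at your own FTP file, local path, and credentials.
                Uri downloadUri = new Uri("ftp://localhost/basefile.txt");
                string destinationFile = @"C:\Temp\basefile.txt";

                FTPFileTransfer transfer = new FTPFileTransfer();
                FtpStatusCode status = transfer.Download(destinationFile, downloadUri, "ftpUser", "ftpPassword");

                Console.WriteLine("Download finished with status: " + status);
            }
        }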

Please post your comments about the article.

Wednesday, September 8, 2010

A Few Things that We Know But Fail to Implement...

These thoughts have already been written in many places and heard from many of us, but I am rewriting them just to remind myself and everyone who is going to read this post. Just wait for a moment, think, and try to implement as much of it as you can, and life may become happier and easier to live.




Anger is a condition in which
The tongue works faster than the mind .

*********

Harsh words break no bones
but they do break hearts.

*********

For every minute you are angry with someone,
you lose 60 seconds of happiness
that you can never get back.

*********

You can't change the past,
but you can ruin the present
By worrying over the future

*********

If you fill your heart with regrets
of yesterday and the worries of tomorrow,
you have no today to be thankful for.

*********

To get out of a difficulty,
one usually must go through it.

*********

The choice you make today
will usually affect tomorrow.

*********

All people smile in the same language.

*********

Laughter is God's sunshine

*********

Take time to laugh, for it is
the music of the soul.

*********

Love is strengthened by working
through conflicts together.

*********

Love is the only thing that can be
divided without being diminished.

*********

Everyone has beauty
but not everyone sees it.

*********

Man looks at outward appearance
but the Lord looks within.

*********

We take for granted the things
that we should be giving thanks for.

*********

Happiness is enhanced by others
but does not depend upon others.

*********

Thank God for what you have,
TRUST GOD for what you need.

*********

Do what you can, for who you can,
with what you have, and where you are.

*********

The real measure of a man's wealth
is what he has invested in eternity.

Wednesday, September 1, 2010

Understanding the Components of an Integration Services Package

In the last post you read about the SSIS project components and folders, which include the SSIS package files. An SSIS package is a collection of managed control flow, data flow, and event handler elements that can run independently and can be scheduled as a job to carry out its tasks.
In this post I will explain the components of an Integration Services package in detail; each component will then be dealt with separately in its own post.
Let's recap what we discussed about the SSIS package in the earlier post.
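
As a side note, a package saved as a .dtsx file can also be loaded and run from .NET code through the Microsoft.SqlServer.Dts.Runtime API that ships with Integration Services. A minimal sketch, assuming a package at the hypothetical path C:\SSIS\MyPackage.dtsx:

    using System;
    using Microsoft.SqlServer.Dts.Runtime;

    class RunPackage
    {
        static void Main()
        {
            // Hypothetical path to a package built in SSIS Designer.
            string packagePath = @"C:\SSIS\MyPackage.dtsx";

            Application app = new Application();

            // Load the .dtsx file from the file system (no events listener supplied here).
            Package package = app.LoadPackage(packagePath, null);

            // Execute the package and inspect the overall result.
            DTSExecResult result = package.Execute();
            Console.WriteLine("Package finished with result: " + result);
        }
    }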

Now, broadly, an SSIS package has the following components:
  1. Control Flow
  2. Data Flow
  3. Event Handlers
  4. Connections
Let's discuss each of them in detail:

Control Flow: SQL Server Integration Services provides three different types of control flow elements: containers that provide structures in packages, tasks that provide functionality, and precedence constraints that connect the executables, containers, and tasks into an ordered control flow.



Hence we can summarise that the Control Flow component of SSIS has the following elements:
  • Containers
  • Tasks
  • Precedence constraints
Now it's time to know more about each element of the Control Flow.

1. Container: Containers provide structure in packages and services to tasks in the control flow. Integration Services includes the following container types for grouping tasks and implementing repeating control flows (a small programmatic sketch follows this list):
  • The Foreach Loop container enumerates a collection and repeats its control flow for each member of the collection.
  • The For Loop container repeats its control flow until a specified expression evaluates to False.
  • The Sequence container lets you define a subset of the control flow within a container and manage tasks and containers as a unit.
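
For reference, containers can also be created programmatically by adding executables to a package through their stock monikers in the Microsoft.SqlServer.Dts.Runtime API. The sketch below is only an illustration; the variable name and loop expressions are placeholders:

    using Microsoft.SqlServer.Dts.Runtime;

    class ContainerSketch
    {
        static void Main()
        {
            Package package = new Package();

            // A package variable used by the For Loop expressions below (placeholder name).
            package.Variables.Add("Counter", false, "User", 0);

            // Sequence container: groups related tasks so they can be managed as one unit.
            Sequence sequence = (Sequence)package.Executables.Add("STOCK:SEQUENCE");
            sequence.Name = "Load Staging Tables";

            // For Loop container: repeats its control flow until the EvalExpression is False.
            ForLoop forLoop = (ForLoop)package.Executables.Add("STOCK:FORLOOP");
            forLoop.Name = "Retry Loop";
            forLoop.InitExpression = "@Counter = 0";
            forLoop.EvalExpression = "@Counter < 3";
            forLoop.AssignExpression = "@Counter = @Counter + 1";
        }
    }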
2. Task: Tasks do the work in packages. Integration Services includes tasks for performing a variety of functions.
  • The Data Flow task defines and runs data flows that extract data, apply transformations, and load data.
  • Data preparation tasks copy files and directories, download files and data, save data returned by Web methods, or work with XML documents.
  • Workflow tasks communicate with other processes to run packages or programs, send and receive messages between packages, send e-mail messages, read Windows Management Instrumentation (WMI) data, or watch for WMI events.
  • SQL Server tasks access, copy, insert, delete, or modify SQL Server objects and data.
  • Analysis Services tasks create, modify, delete, or process Analysis Services objects.
  • Scripting tasks extend package functionality through custom scripts.
  • Maintenance tasks perform administrative functions, such as backing up and shrinking SQL Server databases, rebuilding and reorganizing indexes, and running SQL Server Agent jobs.

3. Precedence Constraints: Precedence constraints connect containers and tasks in packages into an ordered control flow. You can control the sequence of execution for tasks and containers, and specify conditions that determine whether tasks and containers run.
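
To make this concrete, the hedged sketch below adds two stock tasks (left unconfigured) and wires them together so that the second task runs only if the first one succeeds. The monikers and types come from the Microsoft.SqlServer.Dts.Runtime API:

    using Microsoft.SqlServer.Dts.Runtime;

    class PrecedenceSketch
    {
        static void Main()
        {
            Package package = new Package();

            // Two tasks added by their stock monikers (not configured in this sketch).
            Executable loadTask = package.Executables.Add("STOCK:PipelineTask");   // Data Flow task
            Executable mailTask = package.Executables.Add("STOCK:SendMailTask");   // Send Mail task

            // Precedence constraint: run the mail task only when the data flow task succeeds.
            PrecedenceConstraint constraint = package.PrecedenceConstraints.Add(loadTask, mailTask);
            constraint.Value = DTSExecResult.Success;
        }
    }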
 
 
Now it's time to learn about the Data Flow component of an SSIS package.
 
Data Flow: SQL Server Integration Services provides three different types of data flow components: sources, transformations, and destinations. Sources extract data from data stores such as tables and views in relational databases, files, and Analysis Services databases. Transformations modify, summarize, and clean data. Destinations load data into data stores or create in-memory datasets.
 
 
Hence we can summarise that the Data Flow component has mainly the following elements:
  • Sources
  • Transformations
  • Destinations
  • Inputs
  • Outputs
  • Error Outputs
Let's see each element in detail:

1. Source:
                In Integration Services, a source is the data flow component that makes data from different external data sources available to the other components in the data flow.  
  • The source for a data flow typically has one regular output. The regular output contains output columns, which are columns the source adds to the data flow.
  • The regular output references external columns. An external column is a column in the source. For example, the MakeFlag column in the Product table of the AdventureWorks database is an external column that can be added to the regular output. Metadata for external columns includes such information as the name, data type, and length of the source column.
  • An error output for a source contains the same columns as the regular output, and also contains two additional columns that provide information about errors. The Integration Services object model does not restrict the number of regular outputs and error outputs that sources can have. Most of the sources that Integration Services includes, except the Script component, have one regular output, and many of the sources have one error output. Custom sources can be coded to implement multiple regular outputs and error outputs.
2. Transformation:
                               The capabilities of transformations vary broadly. Transformations can perform tasks such as updating, summarizing, cleaning, merging, and distributing data.

  • The inputs and outputs of a transformation define the columns of incoming and outgoing data. Depending on the operation performed on the data, some transformations have a single input and multiple outputs, while other transformations have multiple inputs and a single output. Transformations can also include error outputs, which provide information about the error that occurred, together with the data that failed: for example, string data that could not be converted to an integer data type. The Integration Services object model does not restrict the number of inputs, regular outputs, and error outputs that transformations can contain. You can create custom transformations that implement any combination of multiple inputs, regular outputs, and error outputs.

3. Destinations:
                         A destination is the data flow component that writes the data from a data flow to a specific data store, or creates an in-memory dataset.
  • An Integration Services destination must have at least one input. The input contains input columns, which come from another data flow component. The input columns are mapped to columns in the destination.
  • Many destinations also have one error output. The error output for a destination contains output columns, which typically contain information about errors that occur when writing data to the destination data store. Errors occur for many different reasons. For example, a column may contain a null value, whereas the destination column cannot be set to null.

 

4. Inputs

              Destinations and transformations have inputs. An input contains one or more input columns, which can refer to external columns if the data flow component has been configured to use them. Inputs can be configured to monitor and control the flow of data: for example, you can specify if the component should fail in response to an error, ignore errors, or redirect error rows to the error output. You can also assign a description to the input or update the input name. In SSIS Designer, inputs are configured by using the Advanced Editor dialog box. For more information about the Advanced Editor, see Integration Services User Interface.

 
5. Outputs:
                  Sources and transformations always have outputs. An output contains one or more output columns, which can refer to external columns if the data flow component has been configured to use them. Outputs can be configured to provide information useful to downstream processing of the data. For example, you can indicate whether the output is sorted. You can also provide a description for the output, or update the output name. In SSIS Designer, outputs are configured by using the Advanced Editor dialog box.

 
6. Error Outputs
                          Sources, destinations, and transformations can include error outputs. You can specify how the data flow component responds to errors in each input or column by using the Configure Error Output dialog box. If an error or data truncation occurs at run time and the data flow component is configured to redirect rows, the data rows with the error are sent to the error output. By default, an error output contains the output columns and two error columns: ErrorCode and ErrorColumn. The output columns contain the data from the row that failed, ErrorCode provides the error code, and ErrorColumn identifies the failing column.

 
 
Connections
                     Microsoft SQL Server Integration Services packages use connections to perform different tasks and to implement Integration Services features (a brief code sketch follows the list below):

  • Connecting to source and destination data stores such as text, XML, Excel workbooks, and relational databases to extract and load data.
  • Connecting to relational databases that contain reference data to perform exact or fuzzy lookups.
  • Connecting to relational databases to run SQL statements such as SELECT, DELETE, and INSERT commands and also stored procedures.
  • Connecting to SQL Server to perform maintenance and transfer tasks such as backing up databases and transferring logins.
  • Writing log entries to text and XML files and SQL Server tables, and writing package configurations to SQL Server tables.
  • Connecting to SQL Server to create temporary work tables that some transformations require to do their work.
  • Connecting to Analysis Services projects and databases to access data mining models, process cubes and dimensions, and run DDL code.
  • Specifying existing or creating new files and folders to use with Foreach Loop enumerators and tasks.
  • Connecting to message queues and to Windows Management Instrumentation (WMI), SQL Server Management Objects (SMO), Web, and mail servers.
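
At the object-model level, each of these connections is represented by a connection manager, which can also be added to a package in code. A hedged sketch using the Microsoft.SqlServer.Dts.Runtime API; the names and connection strings are placeholders:

    using Microsoft.SqlServer.Dts.Runtime;

    class ConnectionSketch
    {
        static void Main()
        {
            Package package = new Package();

            // OLE DB connection manager for a relational source or destination (placeholder connection string).
            ConnectionManager oleDb = package.Connections.Add("OLEDB");
            oleDb.Name = "AdventureWorksOleDb";
            oleDb.ConnectionString =
                "Provider=SQLNCLI10;Data Source=localhost;Initial Catalog=AdventureWorks;Integrated Security=SSPI;";

            // Flat file connection manager pointing at a placeholder text file.
            ConnectionManager flatFile = package.Connections.Add("FLATFILE");
            flatFile.Name = "InputFile";
            flatFile.ConnectionString = @"C:\Data\input.txt";
        }
    }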

 
 
In this section, we will see the types of connection managers we can use in an SSIS package:
  1. Connection Managers : Describes how connection managers work.
  2. Analysis Services Connection Manager: Describes the connection manager for connecting to an instance of Analysis Services or an Analysis Services project.
  3. ADO Connection Manager: Describes the connection manager for connecting to relational data sources by using ADO.
  4. ADO.NET Connection Manager: Describes the connection manager for connecting to relational data sources by using ADO.NET.
  5. Cache Connection Manager: Describes the connection manager for reading data from the data flow or from a cache file (.caw), and saving data to a cache file.
  6. Excel Connection Manager: Describes the connection manager for connecting to Excel workbooks.
  7. File Connection Manager: Describes the connection manager for connecting to a single file or folder.
  8. Flat File Connection Manager: Describes the connection manager for accessing data in a single flat file.
  9. FTP Connection Manager: Describes the connection manager for connecting to an FTP server.
  10. HTTP Connection Manager: Describes the connection manager for connecting to a Web service or Web site.
  11. MSMQ Connection Manager: Describes the connection manager for connecting to a Message Queuing (also known as MSMQ) message queue.
  12. Multiple Files Connection Manager: Describes the connection manager for connecting to multiple files and folders.
  13. Multiple Flat Files Connection Manager: Describes the connection manager for accessing data in multiple flat files.
  14. ODBC Connection Manager: Describes the connection manager for connecting to data sources by using ODBC.
  15. OLE DB Connection Manager: Describes the connection manager for connecting to data sources by using OLE DB.
  16. SMO Connection Manager: Describes the connection manager for connecting to SQL Server Management Objects (SMO).
  17. SMTP Connection Manager: Describes the connection manager for connecting to SMTP servers.
  18. SQL Server Compact Edition Connection Manager: Describes the connection manager for connecting to SQL Server Compact databases.
  19. WMI Connection Manager: Describes the connection manager for connecting to a Windows Management Instrumentation (WMI) server and specifying a server namespace.
  20. Data Source (SSIS): Describes the data source object from which you can create a connection manager.
  21. Data Source View (SSIS) : Describes a view that you create based on a data source.

 Last but not least,
 Event Handlers
                         At run time, executables (packages and Foreach Loop, For Loop, Sequence, and task host containers) raise events. For example, an OnError event is raised when an error occurs. You can create custom event handlers for these events to extend package functionality and make packages easier to manage at run time.
 
Event handlers can perform tasks such as the following (see the sketch at the end of this section):
  • Clean up temporary data storage when a package or task finishes running.
  • Retrieve system information to assess resource availability before a package runs.
  • Refresh data in a table when a lookup in a reference table fails.
  • Send an e-mail message when an error or a warning occurs or when a task fails.

Note: If an event has no event handler, the event is raised to the next container up the container hierarchy in a package. If this container has an event handler, the event handler runs in response to the event. If not, the event is raised to the next container up the container hierarchy.
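
As an illustration, an event handler can also be attached in code. The hedged sketch below adds an OnError handler at the package level and places a Send Mail task inside it (the mail server, recipients, and message would still need to be configured):

    using Microsoft.SqlServer.Dts.Runtime;

    class EventHandlerSketch
    {
        static void Main()
        {
            Package package = new Package();

            // Attach an OnError event handler at the package level.
            DtsEventHandler onError = (DtsEventHandler)package.EventHandlers.Add("OnError");

            // The handler is itself a container: add a Send Mail task inside it.
            Executable notify = onError.Executables.Add("STOCK:SendMailTask");
        }
    }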

 
We now have a solid overview of the complete SSIS package. Get ready for hands-on experience with real development.
In the next post I will start with the Control Flow elements and development. Be ready!