Historically, data has been transferred in a number of ways, via FTP, SFTP and other protocols, both interactively and in an automated fashion. For instance, (S)FTP scripts allow you to write and execute a chain of commands as required. Interactive sessions, used for short and simple transfers, require you to type in commands each time during the execution process. However, to carry out a large volume of file transfers, you will need an automated file transfer system. By developing an automated file transfer process, you eliminate repetitive work and avoid the need for an employee to spend time entering the same data and commands again and again.
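The scripted, automated approach described above can be sketched in a few lines using Python's standard-library ftplib; the host, credentials and file pattern below are hypothetical placeholders, not values from any real deployment:

```python
"""Minimal sketch of an automated (non-interactive) FTP batch upload.
Host, user, password and directory names are illustrative assumptions."""
from ftplib import FTP
from pathlib import Path


def build_batch(directory: str, pattern: str = "*.csv") -> list[Path]:
    """Collect the files to transfer in one automated batch."""
    return sorted(Path(directory).glob(pattern))


def upload_batch(host: str, user: str, password: str,
                 files: list[Path], remote_dir: str = "/incoming") -> None:
    """Open one session and push every file, with no interactive typing."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.cwd(remote_dir)
        for path in files:
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {path.name}", fh)
```

A scheduler (cron, Task Scheduler) would then run this script on whatever cadence the batch process requires.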
File Transfer Automation: The Blips You May Encounter
IT industry experts estimate that about 50 to 70 per cent of file transfers in the cloud happen in batch mode. This forces organisations to rely on technology and on the automation of business processes. But electronic file transfers over a network are not always smooth, and there are many transfer blips that must be identified and handled. This is also a good starting point for evaluating whether a cloud backup and storage service provider has addressed the problems associated with batch transfers of workloads over the Internet.
Batch workload transfer automation must be an end-to-end process. The IT administrator overseeing or scheduling the automated transfer must have the facility to create, automate, monitor and manage the batch process through the entire transfer cycle.
At the core of the concept is logistics management. Data packages may travel from disparate or even geographically scattered systems. Some of the data may be pushed into the organisation's computing system from the web, or from suppliers and other third parties. The workload process must combine these disparate packages and deliver the consolidated package to the transmission and storage point for processing. Processing may involve encryption, compression, deduplication and evaluation of the data in the batch being readied for transmission to the remote system in the cloud. Missed transfers, delays and errors have to be captured, reported and monitored, and the affected data has to be re-transferred to the cloud-based server.
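The processing step described above (consolidation, deduplication, compression) can be sketched as follows; the function names are illustrative, not a vendor API:

```python
"""Sketch of pre-transmission batch processing: deduplicate packages by
content hash, consolidate them, then compress the result. Names are
illustrative assumptions, not any real product's interface."""
import gzip
import hashlib


def deduplicate(packages: list[bytes]) -> list[bytes]:
    """Drop byte-identical packages using a SHA-256 content hash."""
    seen, unique = set(), []
    for blob in packages:
        digest = hashlib.sha256(blob).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(blob)
    return unique


def prepare_batch(packages: list[bytes]) -> bytes:
    """Consolidate the deduplicated packages and compress the result."""
    consolidated = b"".join(deduplicate(packages))
    return gzip.compress(consolidated)
```

A real pipeline would also encrypt the batch and record per-package metadata so failed items can be identified and re-sent.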
It is evident from the above that batch data transfer on a schedule involves multiple iterations and hand-offs between computing systems within a network before the package is considered ready for transmission. The process can be very complex and has to be closely monitored. A well-run cloud backup, online storage and disaster recovery vendor should have all the systems in place to provide a seamless transfer of data to the cloud.
There are six characteristics that every organisation must look for when signing up for a cloud-based online backup and storage service that uses batch mode:
- The cloud service must provide the administrator with a unified enterprise view for effective monitoring of the disparate systems that connect over the network.
- The cloud vendor's software must have an out-of-the-box capability to distinguish between a successful transfer and a failed one, and must alert the IT administrator via auto-generated alerts, email messages or dashboard status reports.
- The solution must support standards-based, popular file transfer protocols and all platforms used by the enterprise.
- The software interface must be user-friendly.
- Privacy and security must never be compromised.
- The tightly integrated cloud backup solution must be able to meet the service levels agreed upon in the Service Level Agreement (SLA).
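The success-versus-failure check called for in the list above can be sketched as a checksum comparison after upload; the helper names here are assumptions for illustration, not a real product's API:

```python
"""Sketch of transfer verification: compare a SHA-256 checksum of the data
sent with the data the remote side reports receiving, and raise an alert
for the IT administrator on mismatch. Illustrative only."""
import hashlib


def checksum(data: bytes) -> str:
    """Return the SHA-256 hex digest of a payload."""
    return hashlib.sha256(data).hexdigest()


def verify_transfer(sent: bytes, received: bytes, alerts: list[str]) -> bool:
    """Compare checksums; record an alert on failure so the batch
    scheduler can retry the transfer."""
    ok = checksum(sent) == checksum(received)
    if not ok:
        alerts.append("Transfer failed: checksum mismatch, retransmission required")
    return ok
```

In practice the alert would feed the dashboard or email channel the vendor provides, rather than a plain list.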
Now that we have established how data should be handled in batch mode, the question remains: how do we enforce people processes when accessing data in the cloud, and in the case of data recovery?
As we all know, the cloud does all of the following:
- Permits users to upload files and folders from anywhere, at any time and from any kind of device.
- Permits real-time access to files and folders from anywhere, at any time and on any kind of device.
- Permits collaboration on files and folders with anyone on the network or outside it.
- Permits sharing of files and folders with anyone on the network or outside it.
- Permits restoration of files to the same, similar or new hardware anywhere, at any time.
There are a number of administrative and security issues that are immediately apparent:
- Files have to be organised hierarchically and tagged for easy viewing, searching and retrieval.
- The IT administrator must have the facility to provide role-based access to files and folders through some kind of user management system.
- Notifications and alerts must be generated whenever a file or folder is shared or assigned on a collaborative project.
- Notifications and alerts must also reach the administrator whenever an attempt is made to access a file without the requisite permissions.
- Audit trails and logs must be maintained for verification and action whenever required.
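Role-based access with an audit trail, as the bullets above require, might look like this in outline; the role names and permission sets are assumptions chosen for illustration:

```python
"""Sketch of role-based access control with an audit trail. Roles,
permissions and paths are illustrative assumptions."""

ROLES: dict[str, set[str]] = {
    "admin":  {"read", "write", "share", "restore"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

audit_log: list[str] = []


def access(user: str, role: str, action: str, path: str) -> bool:
    """Allow only the actions granted to the user's role, and record
    every attempt (allowed or denied) in the audit trail."""
    allowed = action in ROLES.get(role, set())
    verdict = "ALLOWED" if allowed else "DENIED"
    audit_log.append(f"{user} {action} {path} -> {verdict}")
    return allowed
```

A denied entry in the log is exactly the event that should trigger the administrator notification described above.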
Cloud service providers are conscious that the cloud must enforce people process automation in order to maintain security of the enterprise data throughout its life cycle. They do it in the following ways:
- Most cloud-based services integrate elaborate user management systems into their software. The authentication and authorisation servers are kept separate from the database servers. Access to the data is granted only when the user enters the correct user ID and password assigned within the enterprise storage account management system by the IT administrator. If the user is authenticated, the system allows operations on the data in accordance with the rights and permissions assigned to that user; otherwise, the data remains inaccessible. However, the system cannot distinguish between the authorised, authenticated user and anyone else if passwords are indiscriminately shared.
- Cloud service providers create an additional layer of security by implementing strong cryptographic algorithms for the encryption of data that customers transmit to and store on their servers. Hackers attempting to view the data by alternate routes will be confronted with 256-bit encrypted versions of the data. Without the key (which is generated and managed by the IT administrator), a hacker can neither view the data nor hijack it.
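As a toy illustration only, and deliberately not a production cipher, the sketch below shows why data protected by a 256-bit key is unreadable without that key; real services use vetted AES-256 implementations rather than anything like this:

```python
"""Toy demonstration (NOT a real cipher): XOR data with a SHA-256-derived
keystream under a 256-bit key. Shows the principle that ciphertext is
useless without the key; production systems use vetted AES-256 libraries."""
import hashlib
import secrets


def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a keystream derived from the key; calling it again
    with the same key on the ciphertext recovers the plaintext."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))


key = secrets.token_bytes(32)  # 256-bit key, held by the IT administrator
ciphertext = keystream_xor(key, b"payroll-batch-2024")
```

Anyone holding only `ciphertext` sees opaque bytes; only the key holder can invert the operation.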
If you are migrating to the cloud, check whether the above people process automation systems are at your disposal!
Backup Everything is a UK-based cloud backup and storage firm that handles data batches seamlessly, providing securely encrypted services built on 256-bit encryption technology. Backup Everything has done all it can to educate its clients, and is very conscious that the cloud must enforce people process automation in order to maintain the security of data under its management. If you have any questions, or need a helping hand to transfer your data to the cloud, please contact us for further information.