Quantum has developed a new NAS data and storage management platform called Quantum ATFS (All-Terrain File System). The system integrates real-time data classification and insights with the needs of users’ applications, which together determine how storage resources are allocated and consumed. Organisations can use data insights to visualise data without the constraints of a conventional file system, automate the placement of data within the system based on policies and optimise resources using just-in-time data movement policies.
Quantum believes guesswork has no place in sound data management within an organisation – that is, guessing at capacity or where data is located, or manually searching through file systems to find the files the organisation needs. Users may also be uncertain about what they can delete and when. The result is silos of data and a loss of control and visibility.
In ATFS, data is placed to meet the performance, resiliency, availability and access demands of applications and workflows, at the time it is needed. The resulting end-user experience is consistent whether resources are deployed on-premises or in the cloud, and across different applications.
When Quantum bought storage and data workflow specialist Atavium earlier in 2020, it used Atavium’s developments to reorganise its engineering into primary and secondary storage capabilities. Specifically, Atavium software identifies and manages data and optimises data movement through processing workflows. It performs search and policy-based tiering – flash to disk to S3 cloud – and uses automatic (zero-touch) tagging to mark files as they pass through workflow stages.
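The combination of zero-touch tagging and policy-based tiering can be pictured roughly as follows. This is an illustrative sketch only: the ATFS API is not public, so every name here (`FileRecord`, `tag_on_stage`, `next_tier`, the tier and stage labels) is hypothetical, invented to show the idea of tags accumulating as a file moves through workflow stages and a policy mapping those tags to a tier.

```python
from dataclasses import dataclass, field

@dataclass
class FileRecord:
    """Hypothetical stand-in for a file tracked by the platform."""
    path: str
    tags: set = field(default_factory=set)

def tag_on_stage(record: FileRecord, stage: str) -> FileRecord:
    """Zero-touch tagging: mark a file as it passes a workflow stage."""
    record.tags.add(f"stage:{stage}")
    return record

def next_tier(record: FileRecord) -> str:
    """Policy-based tiering sketch: flash -> disk -> S3 cloud by stage."""
    if "stage:archive" in record.tags:
        return "s3"
    if "stage:render-complete" in record.tags:
        return "disk"
    return "flash"

clip = FileRecord("/projects/promo/clip001.mov")
tag_on_stage(clip, "ingest")
assert next_tier(clip) == "flash"   # active work stays on flash
tag_on_stage(clip, "render-complete")
assert next_tier(clip) == "disk"    # finished renders move to bulk disk
```

The point is that no user action is involved: tags are applied as a side effect of the workflow, and placement follows from the tags.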
ATFS workflow process
ATFS uses data classification, metadata and business-specific tagging to keep the system efficient and optimise storage resource consumption, removing the need for an organisation to pay extra for good performance. It is designed to manage hardware resources as a service to the application, when and where it is needed, at scale.
At post-production facility 5 Guys Named Moe, for example, ATFS is the backbone of its extremely high-bandwidth cloud data migration work. Its head of post-production, Eric A Reid, said: “We use the metadata tagging built into the ATFS platform to prioritise data efficiently, which makes sure that NVMe space is automatically allocated to the most resource-intensive tasks without any manual input. This reduces costs overall and means we have the right balance of storage space and performance available for workflows.”
ATFS ingests data, which can be placed into flash, bulk or the cloud based on policies, application-defined tags, or manually. Performance may be tuned based on the size of the active data set. Automated policies place data just-in-time to support workloads, improving efficiency per unit of storage.
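A placement policy of the kind described above could be expressed as an ordered rule list evaluated at ingest. Again, this is a sketch under stated assumptions: ATFS’s actual policy syntax is not public, so the rule format, field names (`tag`, `age_days`, `size_gb`) and tier names below are invented for illustration.

```python
# Hypothetical first-match-wins placement rules: each rule pairs a
# predicate over file metadata with a target tier.
POLICIES = [
    (lambda f: f["tag"] == "active-render", "flash"),  # hot working set
    (lambda f: f["age_days"] > 90,          "cloud"),  # cold data to S3
    (lambda f: f["size_gb"] > 1,            "bulk"),   # large files to disk
]

def place(file_meta: dict, default: str = "bulk") -> str:
    """Return the tier a newly ingested file should land on."""
    for predicate, tier in POLICIES:
        if predicate(file_meta):
            return tier
    return default

assert place({"tag": "active-render", "age_days": 1, "size_gb": 0.2}) == "flash"
assert place({"tag": "", "age_days": 120, "size_gb": 0.2}) == "cloud"
```

Evaluating such rules at ingest, and re-evaluating them just-in-time as tags and age change, is what lets the platform keep only the active data set on the fastest (and most expensive) tier.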
Automating data classification and placement under ATFS can serve use cases like automating application workflows. It can be integrated with asset management tools, schedulers and other software to automate tasks using APIs in life sciences, media and entertainment and finance.
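An asset manager or scheduler driving ATFS over its APIs might, for instance, tag a file programmatically so that the placement policies pick it up. The endpoint, payload shape and host below are entirely hypothetical – Quantum’s own API documentation defines the real interface; this sketch only shows the general pattern of tagging via a REST call, constructed but deliberately not sent.

```python
import json
from urllib.request import Request

def build_tag_request(host: str, path: str, tags: list) -> Request:
    """Construct (but do not send) a request tagging a file via a
    hypothetical REST endpoint."""
    body = json.dumps({"path": path, "tags": tags}).encode()
    return Request(
        f"https://{host}/api/v1/files/tags",  # invented endpoint for illustration
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_tag_request("atfs.example.com", "/projects/ep01/audio.wav",
                        ["priority:high", "project:ep01"])
assert req.get_method() == "POST"
```

In practice a scheduler would issue such calls at workflow transitions, so classification and placement stay automated end to end.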
For active data retention set-ups, the ATFS metadata and tags simplify access to data over time. Enterprises deploying resources in the cloud can burst into the cloud using cloud-based applications, or use ATFS for large data set retention. For collaboration across an organisation and externally, teams can share data securely, without creating duplicates.
ATFS is also useful for controlling data and recording the inputs and processes that influence it. ATFS classification and placement can be made to execute on retention, protection and access guidelines, per regulations and best practices.
ATFS is available to order beginning December 2020 as software installed on a Quantum appliance. www.quantum.com