Data resilience is an all-encompassing mission that covers identity management, device and network security, and data protection principles like backup and recovery. It is a massive de-risking project, but for it to be effective it requires two things above all else: the already-mentioned visibility, and senior buy-in.
Data resilience starts in the boardroom. Without that buy-in, projects fall flat, funding limits how much can be done, and gaps appear in protection and availability. The fatal 'not my problem' attitude cannot fly anymore. Do not let the size of the task stop you from starting. You cannot do everything, but you can do something, and that is infinitely better than doing nothing. Starting now will be much easier than starting in a year when LLMs have sprung up across the organisation.
Many companies may fall into the same issues as they did with cloud migration all those years ago: going all-in on the new technology and ending up wishing they had planned some things ahead, rather than having to work backwards. Test your resilience by doing drills; the only way to learn how to swim is by swimming.
When testing, make sure you have some realistic worst-case scenarios. Try doing it without your disaster lead; they are allowed to go on holiday, after all. Have a plan B, C, and D. By doing these tests, it is easy to see how prepared you are. The most important thing is to start.
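As a rough illustration of what such a drill loop can look like, the sketch below is plain Python with a hypothetical restore_backup helper standing in for whatever backup tooling is actually in place; the scenarios and pass rates are made up purely to show the shape of the exercise, including a run without the usual disaster lead and fallback plans B, C and D.

# Minimal drill-harness sketch: the scenario list and the restore_backup
# placeholder are illustrative, not any specific product's API.
import random
from datetime import datetime

SCENARIOS = [
    "ransomware on primary storage",
    "regional cloud outage",
    "accidental bulk deletion",
    "disaster lead on holiday",  # run the drill without the usual lead
]

def restore_backup(plan: str, scenario: str) -> bool:
    """Placeholder: call the organisation's real restore procedure for this plan."""
    return random.random() > 0.3  # simulate the odd plan failing

def run_drill(scenario: str) -> None:
    for plan in ("A", "B", "C", "D"):  # always have fallback plans ready
        if restore_backup(plan, scenario):
            print(f"{datetime.now():%Y-%m-%d} | {scenario}: recovered with plan {plan}")
            return
    print(f"{scenario}: every plan failed -- investigate before a real incident does")

if __name__ == "__main__":
    for scenario in SCENARIOS:
        run_drill(scenario)

The value is less in the code than in the habit: every scenario gets exercised on a schedule, and a failure in the drill is a finding, not a crisis.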
Data freedom and data integrity
The movement of data poses one of the most significant risks to data integrity, with the lack of pre-migration testing as the main cause of issues such as data corruption and data loss. This can cause unexpected downtime, reputational damage and loss of important information.
Data integrity begins with awareness. Many organisations do not fully understand what data they have, when it was added or what was updated over time, making it challenging to conduct data audits or integrity checks. Building awareness of data assets is the first step towards validating data and detecting abnormalities based on historical analyses.
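To make that first step concrete, here is a minimal sketch in plain Python; the inventory.json location and the data directory are assumptions for illustration. It records a checksum and last-modified timestamp for each asset once, so later audits can flag files that have gone missing or changed unexpectedly against that baseline.

# Sketch of a simple data-asset inventory: hash each file once, then
# compare later runs against the stored baseline to detect unexpected changes.
import hashlib
import json
import os
from pathlib import Path

INVENTORY = Path("inventory.json")  # assumed location for the baseline record

def fingerprint(path: Path) -> dict:
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return {"sha256": digest, "modified": os.path.getmtime(path)}

def build_baseline(data_dir: Path) -> None:
    baseline = {str(p): fingerprint(p) for p in data_dir.rglob("*") if p.is_file()}
    INVENTORY.write_text(json.dumps(baseline, indent=2))

def audit(data_dir: Path) -> list[str]:
    baseline = json.loads(INVENTORY.read_text())
    anomalies = []
    for path, old in baseline.items():
        p = Path(path)
        if not p.exists():
            anomalies.append(f"missing: {path}")
        elif fingerprint(p)["sha256"] != old["sha256"]:
            anomalies.append(f"changed: {path}")
    return anomalies

if __name__ == "__main__":
    data_dir = Path("data")  # assumed data directory
    if not INVENTORY.exists():
        build_baseline(data_dir)  # first run establishes the baseline
    else:
        print(audit(data_dir))    # later runs flag drift against it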
Then, rigorous and ongoing testing for migration is crucial. This includes testing for both functionality and economics. Functionality refers to how well the system operates after migration, ensuring that it continues to meet expectations; economics refers to the cost-effectiveness of the system or application, which is particularly important with cloud-based migrations.
Economics testing involves examining resource consumption, service costs and overall scalability to ascertain whether the solution is economically sustainable for the business.
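A hedged illustration of how the two checks can sit side by side: the row counts, aggregate checksums and per-gigabyte prices below are assumptions chosen only to show a functional comparison paired with a rough economic estimate, not real tariffs or any vendor's tooling.

# Illustrative post-migration checks: a functional check (do row counts and
# checksums still match?) and an economic check (a rough monthly cost estimate).
from dataclasses import dataclass

@dataclass
class TableStats:
    rows: int
    checksum: str  # e.g. an aggregate hash computed the same way on each side

def functional_check(source: dict[str, TableStats],
                     target: dict[str, TableStats]) -> list[str]:
    issues = []
    for name, src in source.items():
        tgt = target.get(name)
        if tgt is None:
            issues.append(f"{name}: missing after migration")
        elif (src.rows, src.checksum) != (tgt.rows, tgt.checksum):
            issues.append(f"{name}: row count or checksum mismatch")
    return issues

def economic_check(gb_stored: float, gb_egress: float,
                   price_per_gb_month: float = 0.023,   # assumed storage price
                   price_per_gb_egress: float = 0.09) -> float:  # assumed egress price
    """Rough monthly estimate from resource consumption; tune to real tariffs."""
    return gb_stored * price_per_gb_month + gb_egress * price_per_gb_egress

if __name__ == "__main__":
    src = {"orders": TableStats(10_000, "ab12")}
    tgt = {"orders": TableStats(10_000, "ab12")}
    print(functional_check(src, tgt) or "functional check passed")
    print(f"estimated monthly cost: ${economic_check(500, 40):.2f}")

Run against real figures, the same pairing answers both questions at once: does the migrated system still behave as expected, and can the business afford to keep running it that way.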
Rick Vanover, Vice President of Product Strategy, Veeam
Organisations must liken preparing for migration to how pilots train to resolve the unexpected. By planning for the potential problems businesses may encounter during the transfer of data across systems and platforms, the risk and impact of compromised data can be minimised.
Most importantly, companies should prepare for migrations even if they do not anticipate immediate changes. Just as pilots do not wait for poor flying conditions to train for an emergency landing or response, businesses should not wait to be notified of imminent change to initiate data checks and testing. The volatile and fast-paced technological environment means we need to always be prepared to avoid being caught off-guard.
Data freedom is not just about having the ability to move data; it is about ensuring data remains accurate, secure and usable during migrations or platform changes. Regular testing and data assessments help maintain both integrity and freedom, ensuring businesses can rely on their data when it matters most.