May 14, 2023

How to Balance Data Governance with Data Democracy

Data democratization matters to an organization because it gives all users, regardless of technical expertise, the ability to analyze readily accessible, reliable data, informing data-driven decisions and real-time insights. This eliminates the frustration of requesting access, sorting information, or reaching out to IT for help. ... The solution to this problem lies in data federation, which makes data from multiple sources accessible under a uniform data model. This model acts as a "single point of access": the organization creates a virtual database through which data can be queried where it already lives. This makes it easier to query data from different sources in one place. With a single point of access, users can go to one location to search, find, and access every piece of data the organization has. Democratizing data access becomes easier because you no longer need to facilitate access across many different sources.
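The federation idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the class names `SourceAdapter` and `FederatedCatalog` are invented for this example, not from any specific product): each adapter normalizes one source to a shared record shape, and the catalog fans a single query out to every registered source.

```python
class SourceAdapter:
    """Wraps one data source and normalizes its rows to a common model."""

    def __init__(self, name, rows):
        self.name = name
        self.rows = rows  # stand-in for a real connection to a live system

    def query(self, predicate):
        # Data stays where it lives; only matching rows are returned,
        # each tagged with the source it came from.
        return [dict(r, _source=self.name) for r in self.rows if predicate(r)]


class FederatedCatalog:
    """Single point of access: one query fans out to every registered source."""

    def __init__(self):
        self.sources = []

    def register(self, adapter):
        self.sources.append(adapter)

    def query(self, predicate):
        results = []
        for source in self.sources:
            results.extend(source.query(predicate))
        return results


catalog = FederatedCatalog()
catalog.register(SourceAdapter("crm", [{"customer": "Acme", "region": "EU"}]))
catalog.register(SourceAdapter("billing", [{"customer": "Acme", "amount": 120}]))

# One query against the virtual database reaches both systems.
hits = catalog.query(lambda r: r.get("customer") == "Acme")
print(len(hits))  # 2 (one row from each source)
```

Real federation engines add query pushdown, schema mapping, and access control on top of this pattern, but the core idea is the same: users query one catalog, not N systems.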


Will ChatGPT and Generative AI “Replace” Testing?

It stands to reason, then, that ChatGPT and generative AI will not "replace" testing or remove the need to invest in QA. Instead, like test execution automation before it, generative AI will provide a useful tool for moving faster. Yet there will always be more work to do, and a constant (if not greater) need for human input. Testers' time may be spent less on repetitive tasks like scripting, but new processes will fill the void. Meanwhile, the creativity and critical thinking testers offer will not diminish in value as these repetitive processes are automated; such creativity should be given greater freedom. At the same time, your testers will have vital insight into how generative AI should be used in your organization. Nothing is adopted overnight, and identifying the optimal applications of tools like ChatGPT will be an ongoing conversation, just as the testing community has continually explored and improved practices for getting the most out of test automation frameworks. Lastly, as the volume of possible test scenarios grows, automation and AI will need human guidance on where to target their efforts, even as we increasingly use data to target test generation.


How agtech is poised to transform India into a farming powerhouse

Collaboration will be crucial. While agtechs might facilitate better decision making and replace manual farming practices like spraying, reducing dependence on retailers and mandis, incumbents remain important in the new ecosystem for R&D and the supply of chemicals and fertilizers. Successful platforms are already emerging that offer farmers an umbrella of products and services addressing multiple critical pain points. These one-stop-shop agri-ecosystems are also creating a physical backbone and supply chain, which makes it easier for incumbents and start-ups to reach the fragmented farmer base. Agtechs have a unique opportunity to become ideal partners for companies seeking market access. In this scenario, existing agriculture companies create value for the farmer through more efficient and cost-effective access than traditional manpower-intensive setups. It's a self-reinforcing system: the better agtechs know the farmer, the better products they can develop. India's farms have been putting food on the table for India and the world for decades.


How A Non Data Science Person Can Work Effectively With A Data Scientist

Effective communication is essential for a successful partnership. The data scientist should communicate technical procedures and conclusions in a clear and concise manner, while the non-data science person should communicate business requirements and limitations. Both sides can collaborate successfully by developing a clear understanding of the project objectives and the data science methodologies. Setting expectations and establishing the project's scope from the beginning is equally critical. The non-data scientist should specify what they expect from the data scientist, including the results they intend to achieve and the project's schedule. In return, the data scientist should describe their areas of strength and the achievable goals that fall within the project's parameters. It is crucial to keep the lines of communication open and transparent throughout the process. Regular meetings and status reports should be organized to keep everyone informed of the project's progress and to identify any potential issues.


Why Metadata Is a Critical Asset for Storage and IT Managers

File storage and object storage environments handle advanced metadata differently. File storage organizes data in directory hierarchies, which means you can't easily add custom metadata attributes. ... Metadata is massive because the volume and variety of unstructured data, files and objects, are massive and difficult to wrangle. Data is spread across on-premises and edge data centers and clouds, and stored in potentially many different systems. To leverage metadata, you first need a process and tools for managing data. Managing metadata requires both strategy and automation; choosing the best path forward can be difficult when business needs are constantly changing and data types may be morphing as new types such as IoT data, surveillance data, geospatial data, and instrument data are collected. Managing metadata as it grows can also be problematic. Can you have too much? One risk is a decrease in file storage performance. Organizations must consider how to mitigate this; one large enterprise we know switched from tagging metadata at the file level to the directory level.
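The file-level versus directory-level trade-off mentioned at the end can be sketched concretely. This is an illustrative toy model (the paths, tag names, and the `effective_tags` helper are all invented for this example): tagging per file creates one metadata record per object, while tagging per directory keeps one record that many files inherit, shrinking the metadata footprint.

```python
from pathlib import PurePosixPath

files = [
    "/projects/genomics/run1/sample_a.bam",
    "/projects/genomics/run1/sample_b.bam",
    "/projects/genomics/run2/sample_c.bam",
]

# File-level tagging: one metadata record per file (3 records here).
file_tags = {f: {"project": "genomics", "retention": "7y"} for f in files}

# Directory-level tagging: one record per directory; files inherit its tags.
dir_tags = {"/projects/genomics": {"project": "genomics", "retention": "7y"}}


def effective_tags(path, tags_by_dir):
    """Resolve a file's tags by walking up its directory hierarchy."""
    for parent in PurePosixPath(path).parents:
        if str(parent) in tags_by_dir:
            return tags_by_dir[str(parent)]
    return {}


print(len(file_tags), "file-level records vs", len(dir_tags), "directory-level record")
print(effective_tags(files[0], dir_tags)["retention"])  # 7y, inherited from the directory
```

The cost of the directory-level approach is lost granularity: every file under the tagged directory shares the same attributes, which is exactly the compromise the enterprise in the excerpt accepted to protect file storage performance.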


Understand the 3 major approaches to data migration

Application data migration, sometimes called logical data migration or transaction-level migration, is a migration approach that uses the data mobility capabilities built natively into the application workload itself. ... Technique: Some applications offer proprietary data mobility features. These capabilities usually facilitate or assist with configuring backups or secondary storage. The application then synchronously or asynchronously ensures that the secondary storage is valid and, when necessary, can be used without the primary copy. ... Block-level data migration is performed at the storage volume level. Block-level migrations are not concerned with the actual data stored within the storage volume; they carry over file systems of any kind, partitions of any kind, raw block storage, and data from any application. Technique: Block-level migration tools synchronize one storage volume to another from the beginning of the volume (byte 0) to the end (byte N) without processing any data content.
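The block-level technique reduces to a content-agnostic copy loop from byte 0 to byte N. A minimal sketch, assuming in-memory buffers as stand-ins for raw volumes (real tools operate on block devices such as `open("/dev/sdX", "rb")`, use much larger chunks, and track changed blocks for incremental sync):

```python
import io

BLOCK_SIZE = 4096  # illustrative; production tools typically use larger chunks


def block_copy(src, dst, block_size=BLOCK_SIZE):
    """Copy src to dst block by block; returns the number of bytes copied.

    The loop never interprets the bytes: file systems, partition tables,
    and application data all pass through identically.
    """
    copied = 0
    while True:
        block = src.read(block_size)
        if not block:  # end of volume reached
            break
        dst.write(block)
        copied += len(block)
    return copied


# In-memory buffers stand in for the source and target volumes.
source_volume = io.BytesIO(b"\x00" * 10000 + b"opaque payload")
target_volume = io.BytesIO()
n = block_copy(source_volume, target_volume)
print(n)  # 10014 bytes copied, regardless of what the data represents
```

This content-blindness is the approach's strength (it moves anything) and its cost: it also copies free space and cannot skip or transform data the way application-level migration can.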

Tejasvi Addagada

Empowering Digital Transformation through Data Strategy & AI Innovation | Data & Privacy Leader | Speaker & Author


Always an intriguing topic, Kannan Subbiah: where can a trade-off be made between data democratization, privacy, and governance? Data democratization is a culture change, based on a concept that enables easy access to data. The ease of availability and access to data enables direct and indirect data monetization; however, it has to go hand in hand with data privacy.

CHESTER SWANSON SR.

Realtor Associate @ Next Trend Realty LLC | HAR REALTOR, IRS Tax Preparer


Thank you for sharing.
