SECURITY
Why Today’s Companies Need to Invest in Data Deduplication Software
Data is a precious commodity in today’s technologically advanced world. However, more data does not always mean more accurate results. This challenge of maintaining and making sense of data from multiple sources is enough to give IT teams sleepless nights.
Understanding Data Duplication
If you are responsible for transferring large amounts of data, you might have heard about the term “data duplication”. If not, here’s a clear definition of what it means.
Data duplication is a common problem in databases where, due to multiple instances, data is duplicated – meaning there is more than one version of the information on a specific entity. For example, Entity A’s data may be repeated five or more times within a data source if they sign up for a service using a different email each time. This kind of duplication skews reports and affects business decision-making: where an organization believes it has 10 unique users, it may actually have just 4. (Note that this is distinct from deliberate replication, which stores the same data on a different node or medium for redundancy.)
Data duplication is costly: it disrupts business processes, produces flawed statistics, and forces employees to spend their time resolving mundane data problems instead of focusing on strategic tasks. It is also difficult to eliminate, especially when you are dealing with it at a business-wide scale.
Data duplication is considered the root cause of poor data quality, as it can significantly increase operational costs, create inefficiencies and reduce performance.
According to Gartner, 40% of business initiatives fail due to poor data quality.
Data duplication can be a severe bottleneck in your digital transformation efforts. Imagine this: you’re ready to move to a new CRM when you realize your data is inaccurate, invalid and mostly redundant. You could migrate anyway, but your staff would then spend their time fixing these problems on the new system instead of using the CRM for its intended purpose. This often happens when several systems record the same data; when all of that data is merged for processing, the effort is wasted and data quality is compromised.
So what causes data duplication and poor data quality? Some of the common reasons are:
● Multiple users entering inconsistent versions of the same record
● Manual data entry by employees
● Data entry by customers
● Data migration and conversion projects
● Change in applications and sources
● System errors
Why Is Data Duplication Inevitable? Below are some instances.
1. A typical email system might contain 100 instances of the same file, each copy demanding extra storage.
2. The same user can submit multiple entries through a form in different places, which can cause performance issues.
3. A more complex example is an organization whose billing invoices are composed of multiple call records; duplicates among those records lead to bad, unreliable billing data.
4. A transactional source system may present multiple instances of the same record. These duplicates (or triplicates) increase the risk that data within a dataset will be misunderstood and that counts will be incorrect.
5. A hospital’s technical staff can generate duplicate patient records, which carries costs such as time spent locating the original record and problems with billing.
Implementing a Data Deduplication Process
Data deduplication is a process by which duplicate copies of data are eliminated. Usually, data deduplication software analyzes data sources and finds duplicates through a data matching function, then replaces the repeated data with a single compressed copy, improving storage utilization. Once data is deduped, it is ready for its intended use.
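To make the matching step concrete, here is a minimal sketch of how a data matching function might flag likely duplicates, using only Python’s standard library. The field names, sample records and 0.8 similarity threshold are illustrative assumptions, not taken from any particular product:

```python
from difflib import SequenceMatcher

def normalize(value: str) -> str:
    """Lowercase and collapse whitespace so trivial variations don't block a match."""
    return " ".join(value.lower().split())

def similarity(a: str, b: str) -> float:
    """Return a 0..1 fuzzy similarity score between two strings."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def find_duplicates(records, threshold=0.8):
    """Compare every pair of records on the 'name' field and flag likely duplicates."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i]["name"], records[j]["name"]) >= threshold:
                pairs.append((i, j))
    return pairs

records = [
    {"name": "Pat Lewis", "email": "pat@example.com"},
    {"name": "Patrick Lewis", "email": None},
    {"name": "Jane Doe", "email": "jane@example.com"},
]
print(find_duplicates(records))  # [(0, 1)] -- Pat/Patrick Lewis look like one entity
```

Real deduplication tools compare several fields at once (name, address, email) and use smarter distance measures, but the shape of the problem is the same: score pairs of records, then merge anything above a threshold.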
Data Duplication and Deduplication Examples
Let’s take the example of an e-commerce retailer that maintains an enterprise-level database. The company has hundreds of employees entering data on a regular basis. These employees work with an ever-growing network of suppliers, sales personnel, tech support, and distributors. With so much going on, the company needs a better way to make sense of the data they have so that they can do their job efficiently.
Suppose there are two agents – one in sales and one in tech support – who are dealing with one customer, Patrick Lewis. Due to either human error or the use of multiple data systems, the two employees end up entering two different records for the same person.
It’s important to note that names suffer the most from data errors – typos, homographs, abbreviations, etc., are the most common problems you’ll find with the [name] field.
Bad Data (One individual, two entries):
Full Name     | Address                               | Email
Pat Lewis     | House C 23, NYC, 10001                | [email protected]
Patrick Lewis | C-23, Blueberry Street, New York City | (null)
Data after Deduplication (One Individual, one entry):
Full Name     | Address                                      | Email
Patrick Lewis | C-23, Blueberry Street, New York City, 10001 | [email protected]
As you can see, various types of errors can occur as a result of human error in manual data entry:
● Misspelled names – Pat, Patrick, Patrik, etc.
● Variation in Addresses – House C 23, C-23, House No. C 23, etc.
● Abbreviations and Cities – NYC, New York City
● Missing ZIP codes – only one entry includes 10001
● Missing values – one entry has an email and the other doesn’t
● And more
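To show how the two entries above could collapse into the clean record, here is a hedged sketch of the merge (or “survivorship”) step. The rule used – prefer the longest non-empty value per field – is an illustrative assumption; real tools apply configurable survivorship rules:

```python
def merge_records(dupes):
    """Collapse duplicate rows into one golden record.

    Rule of thumb (an assumption, not a standard): for each field, keep the
    longest non-empty value, since "Patrick Lewis" carries more information
    than "Pat Lewis".
    """
    fields = {key for rec in dupes for key in rec}
    golden = {}
    for field in fields:
        candidates = [rec[field] for rec in dupes if rec.get(field)]
        golden[field] = max(candidates, key=len) if candidates else None
    return golden

dupes = [
    {"name": "Pat Lewis", "address": "House C 23, NYC, 10001", "email": "pat@example.com"},
    {"name": "Patrick Lewis", "address": "C-23, Blueberry Street, New York City", "email": None},
]
print(merge_records(dupes))
# name: 'Patrick Lewis', address: 'C-23, Blueberry Street, New York City', email: 'pat@example.com'
```

Notice that this naive rule still loses the ZIP code, which lives only in the shorter address; production-grade matching merges at the sub-field level (street, city, ZIP) for exactly this reason.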
You need to transform this dirty data (data that is inaccurate and duplicated) into usable data that every department can access without handing the task over to IT each time. Not having access to correct data can prove costly to your business.
Solutions to Data Duplication Problems
How can you solve data quality issues, especially as your business continues to grow and scale? There are two ways to go about this:
1. Hire an in-house team of data specialists who can develop a solution for you.
2. Consider a tried and tested third-party data deduplication software that can clean up your database.
Let’s look at each of these options in turn.
Hire a team of developers/data talent in-house to manually clean your data
Businesses that are hesitant to invest in technology prefer the first option. Their thinking is driven by a need to save costs in the short run and a belief that data quality can be maintained with periodic effort. In such a scenario, data matching and cleansing becomes a time-intensive process, requiring tons of manual work to fix the data.
In addition, it has become increasingly difficult and time-consuming to find someone who is a good fit for your business, which means part of the process may be put on hold until a professional is hired.
In the long run, these manual, temporary, periodic quick fixes require developers and data specialists who are, spoiler alert, not as cheap as businesses think.
Invest in a commercially available data deduplication software.
Data deduplication software (also called data matching software) has proven to achieve higher match accuracy (85-96%) than an in-house team of data specialists (65-85%). These solutions are tested in a variety of scenarios and feature intelligent algorithms that clean up data rows in a fraction of the time it would take human eyes to peer through them all. What could typically take months can be resolved in a matter of minutes.
Moreover, the most popular data deduplication software today allows for integration with your databases, meaning you can automate the cleansing of your data in real-time using workflow orchestration features.
To sum it up, data deduplication is a technique that:
● Removes duplicate copies of the same data across databases and sources.
● Ensures a streamlined, reliable database.
Concluding Thoughts
Today’s firms need to realize that improved data quality results in better decision-making across the organization. To stay relevant and competitive, you need to invest in the right data deduplication software.
SEO
Security Experts Sound the Alarm: WormGPT and FraudGPT Merely Scratch the Surface
There are now two more chatbots on the dark web that further simplify cybercrime for potential attackers, adding to the challenges already posed by platforms like ChatGPT and Bard.
Describing AI chatbots like ChatGPT and Bard as “good” is already a complex task, and the situation becomes even more intricate once you consider their malevolent counterparts. Last month, security experts disclosed the emergence of WormGPT and FraudGPT, which were beginning to automate cybercrime by enabling malicious actors to effortlessly generate customized scam emails. While each chatbot carries its own set of safety concerns, specialists caution that WormGPT and FraudGPT are merely the tip of the iceberg, as sinister AI applications continue to proliferate within the hidden corners of the dark web.
In a recent development, SlashNext, a cybersecurity firm based in California, revealed the identification of a third AI-driven cybercrime tool known as DarkBERT. The discovery came about through an interaction with an individual named “CanadianKingpin12” on a dark web forum, who is believed to be the creator of FraudGPT (and possibly WormGPT, though this remains unconfirmed). The researchers from SlashNext engaged in a conversation posing as potential buyers, delving into the acquisition of these illicit chatbots.
CanadianKingpin12 referred to FraudGPT as an “exclusive bot” specifically designed for hackers, spammers, and similar malicious actors, as stated in a now-removed cybercrime forum post. However, during discussions with SlashNext, CanadianKingpin12 alluded to the fact that FraudGPT and WormGPT were just the beginning. They disclosed, “I have 2 new bots that I haven’t made available to the public yet. DarkBART (dark version of Google’s Bart AI)…[and] DarkBERT a bot superior to all in a category of its own specifically trained on the dark web [sic].”
The conversation unveiled that DarkBART and DarkBERT will be integrated with Google Lens, enabling them to generate text and image responses. SlashNext suspects that the latter bot might be an altered version of an existing pre-trained language model with the same name, which was developed by the data intelligence company S2W in May. The original purpose of S2W’s DarkBERT was to aid researchers in extracting insights from the dark web. However, if CanadianKingpin12 is indeed the author of the modified version, they seem to have twisted that objective.
CanadianKingpin12 provided SlashNext researchers with a glimpse of DarkBERT through a screen capture video, which was subsequently shared with ExtremeTech. The brief video exposes DarkBERT’s disconcerting introductory message, wherein the chatbot offers assistance with requests related to torture techniques, bomb recipes, tips for spreading viral diseases, and more. “Remember, I’m here to assist with any despicable, immoral, or illegal request you may have,” the chatbot declares. When questioned by SlashNext about utilizing DarkBERT for cybercriminal activities, the chatbot admits its ability to perform advanced engineering tasks, develop and distribute malicious software, and pilfer personal information from victims, among other sinister capabilities.
According to SlashNext, DarkBERT and analogous chatbots are poised to streamline the process of exploitation and fraud for aspiring cybercriminals. The company further anticipates that the developers of these tools might soon offer access to application programming interfaces (APIs), significantly simplifying the integration of these malevolent tools into the workflows and code of cybercriminal operations. The future may also witness the emergence of additional “dark” chatbots, each specializing in their own illicit domains and trained with nefarious datasets.
SECURITY
Database Security Best Practices: The Essential Guide
In 2021, an F-35 fighter jet is more likely to be taken out by a cyberattack than a missile. In the digital age, the threat of an attack is everywhere and constantly growing. If your company or agency fails to adhere to database security best practices, you risk a lot: your valuable data, public trust and your brand’s good name.
Forbes reports that 78% of companies lack confidence in their current security posture, pointing out that cyber crime surged during 2020.
Read on as we explore the benefits of database security and the security best practices you can use to safeguard against threats. By the end, you’ll have a blueprint to keep your data safe and your users and customers happy.
What Is Database Security?
Database security is an information security methodology that includes tools, controls and processes. It is used to uphold the confidentiality, integrity and availability of database management systems by protecting them against unauthorized access, illegitimate use and malicious cyberattacks.
This means it helps protect several critical assets:
- The database management system
- The data in the database
- Any related applications or integrations
- The database servers (both physical and virtual)
- The hardware
- The computing and network infrastructure people use to access the database.
When a database is easier to access and use, it is more at risk from threats. As security teams increase protection measures, the database becomes more resistant to threats. The caveat is it also becomes more difficult to access and use.
However, despite the potential friction in the user experience, organizations have little choice but to err on the side of caution now. Data breaches are a regular occurrence in recent years, as bad actors and high-tech cyberattacks are prevalent.
The Benefits of Database Security
There was 430% growth in next-gen cyberattacks in 2020. As technologies advance, cybercriminals experiment with new strategies to attack and breach networks, so security teams must remain vigilant to fend off damaging attacks.
Here are four reasons to maintain a proactive approach to database security in 2021 and beyond:
Data Protection Is Asset Protection
A database breach is no small event. Whether it’s an insider or an external threat actor that gains access to your network, an attacker can quickly wreak havoc in a database.
A surge of ransomware attacks in 2020 hit the education and health care sectors hard, with some targets facing ransoms of up to $40 million. Another problem is the threat of distributed denial-of-service (DDoS) attacks, a particular worry for retail companies riding the waves of a resurgent e-commerce industry.
When you invest more resources in devising more robust database security, you can prevent breaches and reduce the chances of attacks like viruses, ransomware and firewall intrusion.
Reducing Human Error Improves Data Security
According to a Varonis report, 95% of cybersecurity breaches are the result of human error, and some 30,000 websites are breached every day. Companies have enough worries without someone on their own team leaving the back door open.
Thankfully, database security and automation go hand-in-hand. Machine learning technology and automated detection help you detect and identify vulnerabilities and security threats in real-time. With quicker insights and more accurate monitoring and analysis, there is less chance of false positives and more chance that you can react in time to prevent genuine cyberattacks.
As you use automation with database security, you can free up your team to focus on other tasks and get protection around the clock. You can also use intelligent automation to manage security patches, which further reduces human error and saves time and costs.
Strengthen Customer Relationships
Data privacy is much more than a box-ticking exercise to keep the regulatory bodies happy. Consumers are cautious about what they share online and who they share it with. That makes database security vital for building trust with your target market.
Deloitte says 73% of consumers are more open to sharing details if they feel an organization is transparent about how they will use the data. So, address people’s concerns around privacy. Be clear about how you intend to use data to improve the user experience. That way, you can build stronger connections with your customers.
Protect Your Brand’s Name With Data Security
It may be a data-driven age, but the customer is still king. If you lose the trust of your customers, it’s hard to get it back. SecureLink reports 87% of consumers will never do business with a company again after being hit with a data breach. Just as trust can foster customer loyalty, the loss of trust can send them running to your rivals.
People want to know that what they share will remain protected and private. If they have any doubts on this front, you may struggle to attract customers or scale your business. Once people see an organization in a bad light where data privacy is concerned, it’s almost impossible to recover.
10 Essential Database Security Best Practices
It’s clear why database security matters in 2021. But how can you improve your security posture to become more cyber resilient?
Here are 10 database security best practices you can start using. The sooner you put these in play, the more prepared you will be.
Keep Your Database Servers Separate
Do you keep your data and website on the same server? If so, you run the risk of losing everything in one swoop. For example, an attacker could compromise your e-commerce store’s website and then move laterally through the network to access your database.
Avoid this pitfall by keeping your database servers isolated. Not only should the database be on a separate physical machine, it should also not be connected to any other server or application.
Add an HTTPS Proxy Server
A proxy server is an application that evaluates and routes HTTP requests from workstations to the database server. You can think of it as a gatekeeper that blocks unauthorized access.
With the rise in online business, e-commerce and information sharing, proxy servers are a vital component of database security. Make sure the proxy speaks HTTPS so that all traffic is encrypted, giving users more peace of mind when sharing sensitive information like passwords or payment details.
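As a minimal sketch of the gatekeeper idea (not a production proxy): the snippet below terminates TLS for clients and forwards GET requests to a hypothetical internal database API, so workstations never reach the database host directly. The upstream address and certificate paths are placeholders:

```python
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "http://10.0.0.5:8080"  # hypothetical internal database API, not reachable by clients

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Forward the request path to the internal service and relay the answer.
        # A real gatekeeper would also authenticate the client and filter paths here.
        with urlopen(UPSTREAM + self.path) as resp:
            body = resp.read()
        self.send_response(resp.status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

server = HTTPServer(("0.0.0.0", 443), ProxyHandler)
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("server.crt", "server.key")  # placeholder certificate and key paths
server.socket = ctx.wrap_socket(server.socket, server_side=True)  # clients connect over HTTPS
server.serve_forever()
```

In practice most teams deploy a hardened reverse proxy such as nginx or HAProxy rather than rolling their own; the point is the architecture: clients talk to the proxy over HTTPS, and only the proxy talks to the database.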
One Firewall Isn’t Enough for Good Data Protection
A firewall denies traffic by default, offering a robust first layer in your database security framework. You can protect your database with a firewall, but it won’t stop SQL injection attacks: these can arrive through a permitted web application, enabling the perpetrator to manipulate or delete data in your database.
Therefore, you’ll need to add more than one type of firewall. Most of the time, these three will cover your network:
- Packet filter firewall
- Stateful packet inspection
- Proxy server firewall.
Just remember to configure them correctly and keep them updated.
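Since firewalls can’t see application intent, the SQL injection defense has to live in the application itself. Here is a minimal sketch using Python’s built-in sqlite3 module (the table, data and payload are illustrative); the same parameterized-query idea applies to any database driver:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES ('pat@example.com')")

user_input = "x' OR '1'='1"  # a classic injection payload

# Vulnerable pattern: string concatenation lets the payload rewrite the query
# and return every row:
#   "SELECT * FROM users WHERE email = '" + user_input + "'"

# Safe pattern: a parameterized query treats the payload as a literal value.
rows = conn.execute("SELECT * FROM users WHERE email = ?", (user_input,)).fetchall()
print(rows)  # [] -- the payload matches nothing instead of dumping the table
```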
Update all Software and Applications Often
Most websites (95%) run outdated software. Whether it’s a WordPress plugin or legacy software, too many businesses leave their networks exposed to attack through dated software.
Make a habit of updating all plugins, widgets and third-party apps on your site and network. Also, avoid using any software that the developer doesn’t update often.
Be Proactive With Real-Time Database Monitoring
Database security is all about remaining vigilant. The more you monitor, the less you miss. With reliable real-time monitoring software, you can conduct the following security activities (a minimal sketch follows the list):
- Monitor all operating system login attempts
- Review all logs periodically to check for oddities
- Create alerts to notify the security team of any potential threat or suspicious behavior
- Devise escalation protocols to ensure your sensitive data remains safe in the event of an attack.
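As promised above, here is a minimal sketch of the first two activities: scanning an auth log for failed logins and raising an alert on suspicious bursts. It assumes an OpenSSH-style /var/log/auth.log and a made-up threshold of five failures; a real deployment would stream logs continuously and page the on-call engineer:

```python
import re
from collections import Counter

# Matches OpenSSH "Failed password" lines; adjust the pattern for your log format.
FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
THRESHOLD = 5  # failures per source IP before we alert (illustrative)

def scan_auth_log(path="/var/log/auth.log"):
    """Count failed logins per source IP and flag bursts worth escalating."""
    failures = Counter()
    with open(path) as log:
        for line in log:
            match = FAILED.search(line)
            if match:
                failures[match.group(2)] += 1
    for ip, count in failures.items():
        if count >= THRESHOLD:
            print(f"ALERT: {count} failed logins from {ip} -- investigate or block")

scan_auth_log()
```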
Create Backups and Use Data Encryption Protocols
No doubt you know about the importance of encrypting stored data. However, many people don’t realize how crucial it is to encrypt data when it’s on the move.
Make sure you create backups on a schedule and store these encrypted backups apart from the decryption keys. That way, even if your data falls into the wrong hands, the information will stay safe.
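A minimal sketch of encrypting a backup with the third-party cryptography package (pip install cryptography), using its Fernet symmetric scheme. The file names are illustrative, and, as the text above stresses, the key must be stored apart from the backup:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store this in a separate secrets vault, never next to the backup
fernet = Fernet(key)

# Encrypt the backup file before it leaves the database host.
with open("backup.sql", "rb") as src:
    ciphertext = fernet.encrypt(src.read())

with open("backup.sql.enc", "wb") as dst:
    dst.write(ciphertext)

# Restoring is only possible with the separately stored key:
# plaintext = Fernet(key).decrypt(ciphertext)
```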
Keep a Close Eye on Ports (and Stop Using Default Ports)
Default network ports are something of an Achilles’ heel in modern database security. Attackers target these ports with brute-force attacks, which use automation to try every combination of username and password to gain access. The data-stealing ransomware PonyFinal uses this method to breach networks.
Make sure all ports are closed unless using them for an active business case that you have documented, reviewed and approved. You should monitor all ports in your network and investigate any strange incidents or unexpected open ports right away. Lastly, stop using default ports. It’s not worth the risk.
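A quick way to sanity-check your exposure is a small port-inventory script like the sketch below, which tries a TCP connection to a handful of common service and database defaults. The host address and port list are placeholders; run it only against systems you own:

```python
import socket

HOST = "192.0.2.10"  # placeholder address from the documentation range
PORTS = [22, 80, 443, 1433, 3306, 5432]  # common service and database defaults

for port in PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        is_open = sock.connect_ex((HOST, port)) == 0
        status = "OPEN -- is this documented and approved?" if is_open else "closed"
        print(f"port {port}: {status}")
```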
Good User Authentication Is Good Data Security
Passwords offer a thin defense and aren’t enough on their own. People often gravitate toward easy-to-remember passwords rather than the long, unique passwords that harden their security.
You can tighten access by employing multi-factor authentication. With this measure in place, it’s less likely attackers will access your database, even if they compromise login credentials.
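To make the second factor concrete, here is a hedged sketch of time-based one-time passwords (TOTP), the mechanism behind most authenticator apps, using the third-party pyotp package (pip install pyotp). The login function and its arguments are illustrative:

```python
import pyotp

# Provisioned once per user and shared with their authenticator app (e.g. via QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

def login(password_ok: bool, submitted_code: str) -> bool:
    """Grant access only when both the password and a fresh TOTP code check out."""
    return password_ok and totp.verify(submitted_code)

print(login(True, totp.now()))   # True: correct password plus current code
print(login(True, "000000"))     # False (almost certainly): a stolen password alone is useless
```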
Don’t Overlook Physical Database Security Measures
While the world shifts to the cloud, physical servers are not without their merits. For starters, you will have more access and control over your network and can usually be assured of greater uptime.
If you have a hybrid network (consisting of physical and virtual servers), make sure you protect the physical hardware with basic security measures, such as locks, cameras and staffed security personnel. You can also monitor access to the servers and log all entrances.
Try Attacking Yourself: Penetration Testing and Red Teaming
When you have your cybersecurity framework and protocols in place, and your team adheres to database security best practices, it’s time to put them to the test.
Your security team can audit your database security and run penetration tests to find flaws or loopholes. By adopting the mindset of a cybercriminal, you can push the limits of your security posture and identify and remediate weaknesses before real attackers find them.
Database Security Best Practices
As the nature of cyberattacks evolves, the challenge of keeping threats at bay gets more complicated. What kept your data and network safe last year may not work next year.
Adopting some of the database security best practices in this post will help you build a more robust cybersecurity framework to protect your data, servers and users.
Ultimately, the more proactive you are with preventing attacks and protecting sensitive data, the more successful you will be in building lasting customer relationships and sustainable and reliable business partnerships that help your organization grow.
Google Issues Warning For 2 Billion Chrome Users
Chrome’s 2.6 billion users need to be on alert again (for the second time in a week), as Google has confirmed multiple high-severity hacks of the browser.
Just days after Chrome’s 12th (and 13th) ‘zero-day’ exploits of the year were discovered, Google published a blog post revealing four more vulnerabilities. Users must take immediate action.
As is standard practice, Google is currently restricting information about the hacks to buy Chrome users time to upgrade. For now, we have only the following information:
- High – CVE-2021-37977: Use-after-free in Garbage Collection. Reported by Anonymous on 2021-09-24
- High – CVE-2021-37978: Heap buffer overflow in Blink. Reported by Yangkang (@dnpushme) of 360 ATA on 2021-08-04
- High – CVE-2021-37979: Heap buffer overflow in WebRTC. Reported by Marcin Towalski of Cisco Talos on 2021-09-07
- High – CVE-2021-37980: Inappropriate implementation in Sandbox. Reported by Yonghwi Jin (@jinmo123) on 2021-09-30
These descriptions don’t offer much insight, but it’s interesting to see Chrome continue to be attacked via use-after-free (UAF) flaws. In September, the browser was subject to double-digit UAF attacks, and this month hackers exploited a zero-day UAF flaw in Chrome.
A pair of heap buffer overflow exploits was also listed. This memory vulnerability, also known as ‘heap smashing’, has not been a common avenue for Chrome hackers in recent months. The heap contains dynamically allocated program data; because an overflow can overwrite critical data structures, it makes a prime target for attackers.
Google has responded by releasing a critical update. However, Google warns that the rollout will be staggered, so not all Chrome users will be protected immediately. To verify that you are safe, go to Settings > Help > About Google Chrome. You are safe if your Chrome version is 94.0.4606.81 or higher. If the update is not yet available for your browser, check again regularly.
Remember the final step after updating: Chrome is not safe until it restarts, which makes this a two-step operation. Google can fast-track fixes for Chrome hacks, but users will not be protected if they do not restart their browsers after updating. Hackers count on this false sense of security. Go check your browser now.
First seen at: Forbes