Earth Day feels like the right time to talk about technology waste, not because I am particularly sentimental about the occasion, but because most professional services firms are sitting on a device lifecycle management problem that is quietly costing them money. Nobody is talking about it in those terms.
I am also going to be honest about something upfront: sustainable technology practices are good for the environment, but I have never once convinced a business to change its approach to hardware solely for environmental reasons. What actually moves the needle is the operational and financial argument. The good news is that the same decisions that reduce e-waste also reduce costs and risk. So the environmental benefit is, in this case, the bonus.
Why Device Lifecycle Management Is a Business Problem First
Most professional services firms I work with do not have a formal device lifecycle management policy. What they have is a replacement habit: when a computer stops working acceptably, or when a staff member complains loudly enough, a new one gets purchased.
The result is an office full of machines of wildly different ages and configurations. Some are running operating systems that are no longer receiving security updates. Some are brand new. Most have not been inventoried in years. That is a security problem as much as an environmental one, and it is also expensive in ways that do not show up on any single invoice.
A reasonable device lifecycle for business computers is three to five years, depending on the workload. Below that range, you are replacing hardware before you can extract reasonable value from it. Above it, you are running machines that are slower than they should be, less secure than they need to be, and more likely to fail at an inopportune time. The operating cost of an aging machine in support time, productivity loss, and security risk tends to exceed the cost of replacement well before the hardware visibly gives out.
Responsible Workstation Setup Includes Planning What Happens at End of Life
When a device reaches the end of its useful life at your firm, a few steps need to be taken before it goes anywhere.
Data must be wiped, not deleted. Wiped. Deleting files does not remove them from a hard drive in a way that prevents recovery. A proper wipe overwrites the storage, making recovery practically impossible. If you are sending devices to a recycler or donating them, this step is not optional. Your clients’ data has been on those machines.
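To make the wipe-versus-delete distinction concrete, here is a minimal Python sketch of a file-level overwrite. This is illustrative only, and the caveat matters: on SSDs, wear leveling means old blocks can survive a file-level overwrite, so actual device disposal should use a full-disk wipe utility or the drive's built-in secure-erase feature. The function name and defaults are mine, not a reference to any specific tool.

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes, then delete it.

    Illustrative only: on SSDs, wear leveling may preserve the old
    blocks. Real end-of-life disposal should wipe the whole drive.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with noise
            f.flush()
            os.fsync(f.fileno())  # force the overwrite onto disk
    os.remove(path)
```

The point of the sketch is the contrast: `os.remove` alone only drops the directory entry, while the overwrite step destroys the bytes themselves.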
Devices that are still functional but no longer appropriate for primary staff may have a second life. Many nonprofits and schools accept used business equipment. If the device has been properly wiped and is running a current operating system, it can provide meaningful value elsewhere rather than going straight to a landfill.
For devices genuinely at the end of life, find a certified e-waste recycler. Most municipalities in Southern California have periodic e-waste collection events. A certified recycler ensures that the materials inside, some of which are genuinely hazardous if handled carelessly, are processed correctly.
Technology Planning for Business Growth Means Fewer Reactive Replacements
One of the most useful things a professional services firm can do, for both its operations and its environmental footprint, is move from reactive device replacement to planned refresh cycles.
Practically, this means knowing what hardware you have, when it was purchased, and when it is due for replacement. A simple spreadsheet works. When you know three years in advance that a wave of machines will need replacing, you can budget for it, plan the transition, and avoid the operational disruption of emergency replacements during busy periods. Tax season is a terrible time to discover that a staff member’s computer has finally given out.
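If a spreadsheet feels too manual, the same logic fits in a few lines of code. Here is a sketch that flags machines past a refresh threshold; the device names, dates, and the four-year threshold are hypothetical placeholders, not a recommendation for any particular fleet.

```python
from datetime import date

# Hypothetical inventory: (device name, purchase date) pairs.
INVENTORY = [
    ("FRONT-DESK-01", date(2019, 6, 1)),
    ("CONTROLLER-PC", date(2022, 3, 15)),
    ("PARTNER-LAPTOP", date(2024, 1, 10)),
]

REFRESH_YEARS = 4  # mid-point of the three-to-five-year range

def due_for_replacement(inventory, today=None):
    """Return devices older than the refresh threshold, oldest first."""
    today = today or date.today()
    overdue = [
        (name, purchased)
        for name, purchased in inventory
        if (today - purchased).days >= REFRESH_YEARS * 365
    ]
    return sorted(overdue, key=lambda item: item[1])
```

Run against a date in early 2025, only the 2019 machine comes back, which is exactly the kind of advance notice that lets you budget a refresh instead of reacting to a failure.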
It also means you stop buying machines during crisis conditions, which is almost always when the worst purchasing decisions are made. When the controller’s computer dies the week before a filing deadline, you buy whatever is available and ship it overnight. When you plan a refresh 12 months out, you have time to evaluate what your staff actually needs and buy accordingly. That is both better technology planning for business growth and considerably less expensive.
The Software Side of Sustainable Technology
Physical hardware is not the only place where waste accumulates. Software subscriptions are the other.
Most firms are paying for licenses they are not using, for platforms that have been partially replaced by something else, or for features within a platform that nobody has ever turned on. A software audit, a straightforward review of what you are subscribed to, who is using it, and whether the cost is justified, is something most firms have never done systematically.
It is not a complex exercise, and it consistently identifies funds that can be reallocated to what actually matters. I have never done one for a client and come up empty.
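The core of a software audit is a single comparison: seats paid for versus seats actually used. A sketch, with made-up platform names and prices, of the arithmetic involved:

```python
# Hypothetical subscription data: monthly cost per seat, seats paid
# for, and seats actually active in the last 90 days.
SUBSCRIPTIONS = [
    {"name": "DocTool Pro", "per_seat": 30.0, "paid": 25, "active": 11},
    {"name": "CRM Suite", "per_seat": 55.0, "paid": 15, "active": 14},
]

def unused_spend(subs):
    """Monthly dollars going to seats nobody is using."""
    report = {}
    for s in subs:
        idle = s["paid"] - s["active"]
        if idle > 0:
            report[s["name"]] = idle * s["per_seat"]
    return report
```

Even in this toy example, fourteen idle seats on one platform add up to hundreds of dollars a month, which is the kind of number that makes the audit worth an afternoon.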
The Practical Starting Point
If you want to do something concrete this month that addresses all of the above, take an inventory. Pull together a list of every computer, laptop, and tablet used in your firm, when it was purchased, and who uses it. If you do not know when something was purchased, a good IT partner can usually determine that from the device’s system information.
Once you have that list, you have the information you need to make actual decisions about device lifecycle management, rather than just reacting to the next thing that breaks.
If you would like help pulling that inventory together or thinking through a refresh and workstation setup strategy, reach out. It is a straightforward conversation, and the starting point is almost always simpler than people expect.
Quick and Easy: Most professional services firms lack a device lifecycle management plan, which means they replace hardware reactively under pressure, run aging machines that pose security risks, and generate more e-waste than necessary. Moving to a planned three-to-five-year refresh cycle, properly wiping devices before retirement, and auditing unused software subscriptions addresses all three problems at once and often saves money.
Two years ago, that sentence would have sounded like paranoid fiction. It does not sound like that right now.
I want to be clear upfront: I’m not here to argue politics. I genuinely do not care which side of the DOGE debate you’re on. What I do care about is that the data situation quietly unfolding within the Social Security Administration has real consequences for your business, your employees, and your clients, and most people are not paying attention.
Let me explain what happened, and more importantly, what it means for you specifically.
What Actually Happened
The Department of Government Efficiency, working inside the Social Security Administration, allegedly copied the entire NUMIDENT database to a cloud environment that bypassed the agency’s standard security protocols. According to a whistleblower complaint filed by the SSA’s former chief data officer, Charles Borges, this was done despite court orders limiting DOGE’s access to the agency’s systems.
The NUMIDENT is not just Social Security numbers. It is every record ever submitted in an application for a Social Security card: names, dates of birth, citizenship status, race and ethnicity, phone numbers, home addresses, and parents’ names and Social Security numbers. For more than 300 million Americans.
Court filings later revealed that DOGE employees used a third-party Cloudflare server not approved for SSA data, sent a password-protected file containing private records to outside affiliates, and that the SSA still cannot fully account for what data left its systems or where it went. The Department of Justice has acknowledged in court filings that earlier statements about the scope of access were inaccurate.
Borges, per his complaint, warned his superiors that the agency might one day be forced to reissue every Social Security number in the country. A Senate investigation put the risk of a catastrophic breach at 65 percent.
Why This Is Different from Every Other Breach
Most data incidents involve something replaceable. Credit card compromised? You get a new one. Password exposed? Reset it. Account hacked? Recover it.
A Social Security number does not work that way. It is the root credential for your credit history, your tax filings, your employment verifications, your professional licenses, your Medicare records, and your background check history. Getting a new one, in the rare cases the SSA permits it, creates nearly as many problems as it solves, because nothing else in your financial life knows about the change.
If this data ends up in the wrong hands, the damage will not look like a fraud alert next week. It will look like a suspicious loan application two years from now, or a tax return filed in your employee’s name before they can file their own. It could look like a wire transfer request that sounds exactly like your CFO, because someone has enough personal details to make it convincing.
The Three Business Risks Worth Taking Seriously
Your employees are now higher-value social engineering targets. If bad actors have an employee’s SSN, home address, employer, and parents’ names, they can construct pretexts that are genuinely hard to detect. Not a generic phishing email. A targeted call that opens with information that sounds like insider knowledge. Professional services firms, where staff regularly handle client funds and sensitive documents, are exactly the kind of target that makes this worthwhile for a criminal.
Your clients are downstream of whatever happens to your team. Accounting firms, law offices, and property management companies hold sensitive financial and personal data on behalf of other people. If an employee identity compromise creates an intrusion into your systems, your clients have a problem too. The liability runs in both directions and it runs fast.
The verification systems your business relies on may become unreliable. If large-scale SSN fraud materializes from this exposure, financial institutions will respond by tightening verification processes. Credit applications, employment checks, and background verifications may get slower, more expensive, or more complicated across the board. That is an operational headache even for firms that do not experience a direct breach.
What You Can Actually Do
None of this requires an expensive platform purchase or a consultant’s SOW. It mostly requires an afternoon and some attention.
Tell your team what happened in plain language. Informed employees are harder to manipulate. A staff that knows their personal data is out there is less likely to be fooled by a pretext that uses it.
Encourage everyone to freeze their credit at all three bureaus. It is free, it is reversible when needed, and it is still the most effective individual defense against identity fraud available. Experian, Equifax, and TransUnion all allow you to do it online.
Set up an alert through ssa.gov so you are notified if anyone attempts to access Social Security benefits using your number.
Review your cybersecurity insurance policy for social engineering coverage specifically. Many policies cover breaches of company systems but have lower limits, or outright exclusions, for employee identity compromise that creates a business loss. Find out before you need to know.
If your firm does not have a written process for what to do when an employee reports identity theft, write one. It does not have to be long. It just has to exist before you need it.
The Bigger Picture
I have written before about the way cybersecurity threats have become environmental. They are not targeted at you specifically. They are more like pollution: pervasive, ongoing, not always visible, and best managed through preparation rather than reaction.
What makes this particular situation harder is that the exposure did not come from a criminal enterprise. It came from inside the institutions we were told to trust with our most sensitive information. That is a more uncomfortable conversation. But avoiding it does not change the exposure.
The firms that handle this well are not the ones with the most sophisticated tools. They are the ones that thought through what they would do before something went wrong, rather than figuring it out in the middle of it.
If you want to talk through what your firm’s actual risk picture looks like right now, reach out. That conversation is always free.
Quick and Easy: DOGE allegedly copied the Social Security Administration’s entire national database to an unauthorized cloud server, and the agency’s own cybersecurity officials raised the possibility of having to reissue every SSN in the country as a worst-case outcome. For professional services firms, the real risks are targeted social engineering of your employees, downstream exposure of your clients, and potential disruption to financial verification processes. The practical responses are mostly free and can be put in place this week.
Tax season is the best stress test your technology will ever get. And it is completely free. You did not ask for it, you cannot opt out, and every year between January and April your systems will tell you exactly where the cracks are. The question is whether you are paying attention.
I work with accounting firms as managed IT clients, and I have worked with several more over the years. The pattern is consistent enough that I could describe it before the season starts: the issues that barely registered in November become full-blown crises in March, usually at the worst possible moment, because that is what technology is reliably good at.
Why Tax Season Is the Real Measure of Your IT Support for Accounting Firms
The most common issues that surface during peak filing season are not new problems. They are old problems that finally got loud enough to demand attention.
Slow systems are the most common complaint, and the cause is almost never a mystery. Machines that are three or four years old, running software that has grown steadily more demanding, start struggling under the weight of high-volume processing. The firm has lived with the sluggishness for months because it was tolerable. In March, when everyone is working longer hours and deadlines are immovable, tolerating it is no longer an option.
Remote access failures are the second most common issue. Hybrid teams that work fine under normal conditions hit their limits when everyone is remote simultaneously and the VPN was never sized for that load. Or a staff member is working from home on a personal device with outdated software that creates compatibility problems with cloud-based tax platforms.
Cloud platform slowdowns round out the top three. Accounting firms run on software like Lacerte, CCH, UltraTax, or Drake. When those platforms slow down or have service interruptions during filing season, it is not just inconvenient. According to one analysis, a single hour of downtime at a ten-person firm with a $200 average billable rate can cost over $1,000 in lost productivity, and that does not count the backlog that builds, the client frustration, or the hit to staff morale.
What Tax Season Actually Reveals About Professional Services Technology
Beyond the specific failures, tax season exposes something more fundamental: whether your firm has professional services technology built for how you actually work, or built for how you worked five years ago.
An accounting firm with no coherent IT support plan tends to normalize the warning signs until they stop feeling like warning signs. Work slows and nobody identifies why. Staff develop workarounds for software that does not behave reliably. Files end up saved in inconsistent locations because nobody established a protocol. None of these are catastrophic on their own, but under peak-season pressure, they compound.
The other thing tax season reveals is your security posture. Accounting firms are high-value targets because they hold a concentration of financial data that is genuinely valuable to criminals. Firms in regulated states like California face stricter data privacy requirements than many owners realize. A ransomware attack the week before the April deadline is not a hypothetical scenario for accounting firms. It happens.
Workflow Optimization Starts with Honest Post-Season Analysis
The instinct after surviving a rough tax season is to exhale, finish the remaining client work, and deal with technology problems later. I understand that instinct. Unfortunately, later tends to become next January, when you are headed into the same situation again.
A post-tax-season review does not have to be comprehensive or expensive. A few honest questions are a reasonable place to start.
What specifically slowed down or broke during the season? Write it down while it is fresh. “The system felt slow” is less useful than “CCH was taking four minutes to load on Maria’s machine starting around March 10.”
Were there any near-misses? Security alerts, unusual login attempts, or phishing emails that someone caught? Those matter too.
What workarounds did your team create? Workarounds are symptoms. They tell you where the official process broke down, which is exactly where your IT attention should go next.
If you have a managed IT partner, share that list with them. If you do not, and your tax season was rougher than it needed to be, that list is a good starting point for a conversation about what a proactive approach to IT support for accounting firms actually looks like.
The goal is not to over-engineer your environment. It is to make sure the systems your firm runs on are built for the way you actually work, not just adequate for a slow Tuesday in October.
Quick and Easy: Tax season reliably surfaces every technology problem your accounting firm has been tolerating, from aging hardware to under-sized VPNs to security gaps, because pressure turns inconveniences into crises. The firms that come out ahead are the ones that treat the post-season debrief as useful data instead of something to forget as quickly as possible. Write down what broke, what slowed down, and what workarounds your team created, then fix those things before next January.
You just don’t know it yet.
I had a conversation recently with a client that stopped me cold. One of their employees had been using a paid AI chatbot to help with administrative work. She was saving herself hours a day. She was sharp, resourceful, and genuinely proud of what she figured out on her own. Unfortunately, she had absolutely no idea she had been feeding client data into a third-party system that her company had never reviewed, approved, or consented to on behalf of the people whose information she was sharing.
When I asked her point blank, “Are you putting client data in there?” she said yes. Then, when I explained what that actually meant, she was horrified. Not because she did something malicious. Because she had no idea there was anything to be horrified about.
That’s the conversation I keep having right now, and I think a lot of business owners need to hear it.
The Part Nobody Explains
What most people do not understand about AI tools is that when you type something into a chatbot, that information does not necessarily stay with you. Depending on the platform, the service’s terms of use, and whatever privacy settings exist in your account, that data may be used to train the model. It may be retained. It may be stored on servers you have no visibility into.
Now, I am not here to tell you that every AI company is doing something sinister. Some are genuinely more careful than others. However, even the most responsible provider operates under a simple truth: unless the platform explicitly states it will not use your data for training purposes, and unless your clients have given you consent to share their information with that platform, you are operating in a gray area.
In professional services, gray areas often become very expensive problems.
The Real Risk for Accounting Firms, Law Offices, and Property Managers
Think about what your employees handle: client financials, legal correspondence, lease agreements, Social Security numbers, medical expense records, and attorney-client communications. This is not generic business information. This is sensitive, regulated, and in many cases privileged data.
Sharing that information with an AI tool, even to do something as mundane as drafting a summary or cleaning up a spreadsheet, is a data-sharing event. The fact that it feels like a productivity shortcut does not change what it actually is.
Cyber insurance carriers are already paying attention to this. Compliance frameworks are catching up. When something goes wrong, the fact that the employee “didn’t know” is not going to satisfy the client whose information ended up somewhere it was never supposed to be.
What I Tell My Clients to Do Right Now
You do not need to ban AI tools. I am not suggesting that. Some of them are genuinely useful and, in the right context, safe. However, you do need to stop pretending this is not happening in your office.
Start with a basic policy. It does not have to be long. It does not have to be complicated. It should answer three questions: which AI tools are approved for use, what categories of data can and cannot be entered into those tools, and who is responsible for reviewing and updating that guidance as things change. Because they will change, probably faster than any of us would like.
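Those three questions can even be captured as a tiny lookup that an office manager maintains alongside the written policy. A sketch, with entirely hypothetical tool names and data categories:

```python
# Hypothetical policy: approved tools and the data categories
# barred from entry into any AI tool, approved or not.
APPROVED_TOOLS = {"Copilot", "InternalSummarizer"}
PROHIBITED_DATA = {"client financials", "SSNs", "privileged communications"}

def is_allowed(tool: str, data_category: str) -> bool:
    """An unapproved tool, or prohibited data in any tool, is a no."""
    return tool in APPROVED_TOOLS and data_category not in PROHIBITED_DATA
```

The value is not the code itself. It is that writing the two lists forces the firm to decide, explicitly, what is approved and what is off-limits.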
Then you need to have the conversation. Not a scary, disciplinary conversation, but a practical one. Most employees using these tools are doing so to do their jobs better. They deserve to understand the actual risks so they can make informed decisions, not get caught off guard as my client’s employee did.
A Word on the AI Companies Themselves
I get asked a lot about which AI providers are the most trustworthy. Honestly, that question is harder to answer than it sounds. This space is constantly shifting, and companies that have solid policies today often quietly revise them later.
What I tell people is this: do not base your data-handling decisions on trust alone. Base them on what the agreement actually says, what your compliance requirements demand, and whether you have any business reason to take on the risk. Copilot, for example, operates within Microsoft’s walled environment, which at least limits where your data can go. Even that is not a blank check to input anything and everything without thinking.
The honest answer is that we are all figuring this out as we go. Even me. The responsible thing is to proceed carefully, ask questions, and not assume that a productivity gain justifies a compliance violation.
Quick and Easy
Employees at professional services firms routinely enter client data into AI tools without understanding the associated privacy and compliance risks. A simple internal policy covering approved tools and prohibited data categories is not a luxury at this point. It is a basic part of running a responsible business.
The 3-2-1 backup rule is one of those things that IT people throw around like everyone should know what it means, and then they’re shocked when business owners look at them like they’re speaking ancient Greek. So let me explain it in plain English, because this rule is genuinely important for protecting your business data.
The rule is simple: you need 3 copies of your data, on 2 different types of media, with 1 copy stored off-site. That’s it. Three, two, one. If you follow this rule, your data will survive almost any disaster that could realistically happen to a small business.
The Three Copies
Three copies means your original data plus two backups. Not three backups. Three total copies. So if you have your files on your server, a backup on an external hard drive, and a backup in the cloud, you’ve got three copies. Your working data is copy number one.
Why three? Because technology fails. According to Backblaze’s hard drive reliability statistics, even the most reliable hard drives have an annual failure rate of about 1.5%. That sounds low until you realize that if you have 50 drives in your office over several years, you’re going to experience failures. Multiple copies ensure that when one fails, you’re not scrambling to recover from your only remaining copy.
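The fleet math is worth seeing, because a 1.5% annual rate compounds quickly across many drives. Assuming failures are independent and the rate is constant, which is a simplification:

```python
def chance_of_any_failure(drives: int, annual_rate: float, years: int = 1) -> float:
    """Probability that at least one drive in a fleet fails,
    assuming independent failures at a constant annual rate."""
    survive_one = (1 - annual_rate) ** years  # one drive lasting the period
    return 1 - survive_one ** drives

# With 50 drives at a 1.5% annual failure rate, the chance of at
# least one failure is roughly 53% in one year and 90% over three.
```

In other words, at office scale a drive failure is not a tail risk. It is close to a certainty over the life of the hardware.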
This also protects you from the scenario where your backup itself gets corrupted. I’ve seen it happen. A backup runs every night for six months, looks completely normal, and then when you try to restore from it you discover the entire backup has been gradually corrupting and is now useless. If you only have one backup, you’re out of luck. If you have two backups, you’ve still got a good copy.
The Two Different Types of Media
Two different types of media means not storing all your backups the same way. If you have your data on a hard drive in your computer, and your backup on another hard drive in your computer, and your second backup on an external hard drive sitting next to your computer, you don’t actually have media diversity. You have three hard drives, all potentially vulnerable to the same type of failure.
Better would be: data on your computer’s hard drive, one backup on an external hard drive, and one backup in the cloud. Now you’ve got local storage and cloud storage. Different technologies, different failure modes. A power surge that kills your computer and external drive won’t touch your cloud backup. A cloud service outage won’t affect your local copies.
This is a critical part of any data loss prevention strategy. Different media types protect you from technology-specific failures. Seagate’s data recovery study found that 67% of data loss incidents affect multiple devices when they use the same storage technology, often due to environmental factors like power issues, temperature, or moisture.
The One Off-Site Copy
This is where most small businesses fail. They get the three copies part right. They might even get the two media types right. But they keep all their backups in the same building as their original data. Which means if that building burns down, floods, gets hit by ransomware, or experiences any other localized disaster, all your copies are gone at once.
Off-site means genuinely off-site. Cloud backup absolutely counts. A backup drive that you take home every week counts. A backup stored at a second office location counts. A hard drive in your office manager’s car does not count, which I’ll explain in another post.
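The whole rule reduces to three checks, which makes it easy to audit your own setup. Here is a sketch that scores a backup plan against 3-2-1; the plan itself is a hypothetical example of the server-plus-external-drive-plus-cloud arrangement described above.

```python
# Hypothetical plan: where each copy of the data lives.
PLAN = [
    {"label": "office server", "media": "hdd", "offsite": False},   # working copy
    {"label": "external drive", "media": "hdd", "offsite": False},
    {"label": "cloud backup", "media": "cloud", "offsite": True},
]

def meets_3_2_1(copies):
    """True if the plan has 3+ total copies, 2+ media types, 1+ off-site."""
    return (
        len(copies) >= 3
        and len({c["media"] for c in copies}) >= 2
        and any(c["offsite"] for c in copies)
    )
```

Drop the cloud copy from that plan and it fails on all three counts at once, which is precisely the all-in-one-building trap this section describes.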
According to the National Archives’ disaster statistics, 93% of companies that experience a significant data loss are out of business within five years. Off-site backup is your insurance policy against that statistic.
Why Professional Business Backup Services Follow This Rule
When you work with professional backup services for business, they implement the 3-2-1 rule automatically. Your data gets backed up locally for fast recovery, backed up to the cloud for off-site protection, and often backed up to multiple cloud locations for redundancy. You don’t have to think about it or remember to do it. It just happens.
This is also why restore services for small business are part of the package. Having three copies stored properly doesn’t help if you can’t actually get your data back when you need it. Professional services test the restores regularly to make sure all three copies are actually usable.
The 3-2-1 Rule for Different Business Sizes
For a solo practice attorney with one computer, this might look like: files on your laptop, continuous backup to an external drive, and automatic cloud backup to a service like Backblaze or Carbonite. Three copies, two media types, one off-site.
For a 15-person accounting firm, it’s more complex. Files on your server, backup to a network-attached storage device, and backup to cloud storage. Maybe also backup to tape drives that get rotated off-site weekly. The principle is the same, but the execution is more sophisticated.
The beauty of the 3-2-1 rule is that it scales. It works whether you’re protecting one laptop or a hundred servers. The specific technologies change, but the logic remains solid.
Quick and Easy
The 3-2-1 backup rule means keeping three total copies of your data, using two different storage technologies, with one copy stored off-site. This data loss prevention strategy protects professional services firms from hardware failure, disasters, and ransomware by ensuring redundancy and geographic distribution of backups.
Many businesses, when trying to get their processes in order, debate whether Microsoft 365 or Google Workspace would better fit their needs. The business world tends to expect Microsoft applications, but plenty of firms run entirely on Google.
Here’s the honest truth: both platforms are good. Both will handle your email, calendar, file storage, and collaboration needs. Both have gotten dramatically better in the past few years. And both will cost you roughly the same amount of money. So if you’re expecting me to tell you that one is objectively superior to the other, you’re going to be disappointed.
What I can tell you is which one works better for the specific ways that accounting firms, law offices, and property management companies actually work.
Where Microsoft 365 Wins
For law firms specifically, Microsoft 365 is usually the better choice, and the reason comes down to two things: document formatting and industry expectations.
Legal documents require precise formatting. Numbered paragraphs, specific indentation, complex tables, cross-references, and redlining that tracks every change made by every attorney who touches a document. Microsoft Word is still the gold standard for this kind of work. Google Docs has gotten better, but it’s still not quite there for complex legal documents. According to the ABA’s 2024 Legal Technology Survey, 94% of law firms still use Microsoft Word as their primary document creation tool.
The second issue is client expectations. When you send a legal document to a client or opposing counsel, they expect to receive a .docx file. They expect to be able to open it in Word, make their comments using Word’s track changes feature, and send it back. You can absolutely do this workflow with Google Workspace, but it creates friction. You’re constantly converting files, worrying about whether formatting survived the conversion, and explaining to clients why your documents look slightly different.
Microsoft 365 also integrates better with practice management software that law firms use. Most legal-specific software was built with Microsoft in mind. The integrations are tighter, the compatibility is better, and you spend less time fighting with your tools.
Where Google Workspace Makes Sense
That said, Google Workspace isn’t a bad choice, and for some firms it’s actually the better option. If your firm is smaller, more nimble, and doesn’t have decades of document templates built in Microsoft Word, Google Workspace can be easier to manage and more intuitive for people who aren’t deeply technical.
Google Workspace setup is simpler than Microsoft 365 deployment. There are fewer moving parts, fewer configuration options, and less that can go wrong. For a 5-person law office that just needs email, calendars, and basic document collaboration, Google Workspace gets you up and running faster with less complexity.
Google’s collaboration features are also more intuitive. Multiple people can edit a document simultaneously, and it just works. With Microsoft 365, you can do the same thing, but it requires OneDrive and specific versions of Office apps, and there’s more that can go sideways.
The Real Cost Comparison
Price-wise, they’re comparable. Microsoft 365 Business Standard runs about $12.50 per user per month. Google Workspace Business Standard is $12 per user per month. You’re not making this decision based on a 50-cent difference. The real costs come from cloud migration support, training your staff, and potential productivity loss during the transition.
According to Forrester’s Total Economic Impact study, organizations switching platforms experience an average productivity dip of 15-20% for the first 2-3 months while people adjust. That’s the real cost you need to factor in. If you’ve been using Microsoft for 20 years, switching to Google isn’t just a technology change, it’s a workflow change.
What About Hybrid Approaches?
Some firms try to split the difference by using Gmail with Microsoft Office apps. This mostly works, but it creates its own complications. You lose some of the tight integration between email and calendar. File storage gets confusing when people aren’t sure whether to save things in Google Drive or OneDrive. And you’re paying for redundant services.
I generally don’t recommend hybrid approaches unless you have a specific technical reason that requires it. Pick one platform and commit to it fully. Your people will be happier, your IT management will be simpler, and you’ll spend less time troubleshooting weird compatibility issues.
Making the Decision
For most law firms and accounting practices I work with, Microsoft 365 is the right choice. The document compatibility, the industry standard status, and the integration with other professional services software outweigh the slightly steeper learning curve and more complex administration.
But if you’re a smaller firm, if you don’t have complex document formatting needs, or if you value simplicity over feature depth, Google Workspace is a perfectly viable option. The key is making the decision based on your actual workflow, not on what some article on the internet told you was “better.”
Quick and Easy
For law firms and accounting practices, Microsoft 365 is usually the better choice due to document formatting requirements and industry standard expectations. Google Workspace works well for smaller firms prioritizing simplicity, but both platforms require careful cloud migration support and training to avoid productivity loss.
I’ve been doing this for over three decades, and I can tell you with absolute certainty that most small business backup strategies are garbage. Not because people don’t care about their data. They do. But because backups are one of those things that everyone assumes is working fine until the moment they desperately need it, and then they discover it’s been broken for six months.
According to Veeam’s 2024 Data Protection Trends Report, 85% of organizations experienced at least one ransomware attack in the past year, but only 23% were able to recover all of their data from backups. Think about that. Three-quarters of companies that got hit couldn’t fully restore from their backups. That’s not a technology problem. That’s a broken backup strategy problem.
The Backups That Don’t Actually Work
Let me tell you what I see constantly in professional services firms. Someone set up a backup years ago. Maybe it was the previous IT person. Maybe it was the office manager who watched a YouTube video. Maybe it was even a reputable IT company that did it right at the time. But then nobody ever tested it. Nobody verified it was running. Nobody checked that the backup software still had a valid license. Nobody noticed when the external hard drive filled up and stopped backing up new files eight months ago.
I’ve walked into law offices where their “backup” was someone copying files to a USB drive every Friday and taking it home for the weekend. I’ve seen accounting firms whose cloud backup hadn’t successfully completed in two years, but nobody noticed because it wasn’t throwing error messages anymore; it just quietly failed in the background.
What Actually Breaks
Backups fail in predictable ways. The backup software loses its connection to the cloud service and nobody notices. The external hard drive gets unplugged when someone needed the USB port and never gets plugged back in. The cloud storage account hits its limit and stops backing up new data. The backup runs, but it’s not actually capturing the open database files that contain all your critical information.
Gartner research shows that 77% of backup failures are only discovered when an organization attempts to restore data. You don’t find out your backup is broken until you need it, which is exactly when you can’t afford to discover that problem.
Or the backup works perfectly, but when you go to restore, you discover that the data is corrupted. Or the restore process is so slow that it would take three weeks to get your data back, and your business can’t survive three weeks of downtime. Or the backup included your files but not the configuration settings you need to actually run your software again.
Data Loss Prevention That Actually Works
Real business backup services for professional services firms need three things. First, they need to be automated and monitored. If your backup depends on someone remembering to do something, it will fail. Humans forget. Humans get busy. Humans quit and nobody tells the new person about the Friday backup routine. Automation removes the human failure point, and monitoring catches it when the automation breaks.
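To make the monitoring half concrete, here is a minimal sketch of a freshness check, assuming your backup job writes files into a directory. The directory path and alert threshold are placeholder assumptions, and in practice the ALERT branch would notify email, Slack, or your RMM tool rather than just print.

```python
import os
import tempfile
import time

def newest_backup_age_hours(backup_dir: str) -> float:
    """Age in hours of the most recently modified file in backup_dir;
    infinity if the directory is empty (no backups counts as stale)."""
    mtimes = [
        os.path.getmtime(os.path.join(backup_dir, name))
        for name in os.listdir(backup_dir)
    ]
    if not mtimes:
        return float("inf")
    return (time.time() - max(mtimes)) / 3600

def check_backup(backup_dir: str, max_age_hours: float = 26) -> str:
    """Return an OK/ALERT status line. Wire the ALERT case to a real
    notification channel -- a silent script is exactly the failure
    mode this check exists to prevent."""
    age = newest_backup_age_hours(backup_dir)
    if age > max_age_hours:
        return f"ALERT: newest backup is {age:.1f} hours old"
    return f"OK: newest backup is {age:.1f} hours old"

# Demo against a temporary directory standing in for the real target
with tempfile.TemporaryDirectory() as demo_dir:
    open(os.path.join(demo_dir, "clients-2026-01-05.bak"), "w").close()
    print(check_backup(demo_dir))   # fresh file, so this reports OK
```

The 26-hour default gives a daily job a couple of hours of slack; tune it to your actual schedule.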
Second, backups need to be tested regularly. Not once when you set them up. Regularly. At least quarterly, you or your IT provider should be doing test restores. Pick a random file and restore it. Pick a random user account and verify you can recover their email. According to Infrascale’s Small Business Backup Report, businesses that test their backups quarterly have a 95% success rate in actual disaster recovery situations, compared to 22% for those who never test.
Third, you need redundancy. A single backup isn’t a backup, it’s a single point of failure. You need multiple copies in multiple locations using multiple methods. This is where disaster recovery planning intersects with backup strategy.
What Professional Backup Services Actually Do
Professional backup services for businesses aren’t just about the technology. They’re about having someone whose job is to make sure your backups are working. Someone who gets alerted when a backup fails. Someone who verifies that restores are possible. Someone who updates the backup strategy as your business changes.
For most professional services firms, this means managed backup services where your IT provider is actively monitoring your backups, not just “providing” backup software and hoping you figure it out. You need someone watching the logs. You need someone expanding storage when you’re running low. You need someone testing restores before you have an emergency.
And you need proper disaster recovery planning, which is more than just backups. It’s having documented procedures for what happens when disaster strikes. Who do you call? What gets restored first? How do you communicate with clients during downtime? These aren’t questions you want to be figuring out while your office is on fire or your systems are encrypted by ransomware.
Quick and Easy
Most backup strategies fail because they’re never tested, not properly monitored, or lack redundancy. Professional business backup services include automated monitoring, regular restore testing, and disaster recovery planning to ensure your data is actually recoverable when you need it.
In January 2026, a mid-sized accounting firm in Orange County received notice that its cyber insurance claim had been denied. The firm had been hit with ransomware, had to shut down operations for five days, lost client data, and faced reporting requirements to multiple regulatory bodies. The recovery cost exceeded $300,000. Its insurance policy had a $2 million limit for cyber incidents. However, the carrier denied the claim in full after its post-breach audit revealed the firm wasn’t consistently enforcing the security controls it had attested were in place when it purchased the policy.
This is not an isolated incident. It’s the new reality of cyber insurance in 2026.
Why Insurance Requirements Have Gotten Stricter
Cyber insurance carriers have been getting hammered by claims. According to Fitch Ratings’ analysis, cyber insurance claims increased 74% year over year, with the average ransom payment reaching $2.73 million in 2024. Ransomware attacks have increased in frequency and sophistication, and insurance companies have responded by tightening underwriting requirements and becoming much more aggressive in verifying that firms actually maintain the security posture they claim to have.
For professional services firms such as accounting practices, law offices, and property management companies, this creates a significant challenge. You need cyber insurance because the risk is genuine and the potential costs are catastrophic. IBM’s Cost of a Data Breach Report 2024 found that the average cost of a data breach reached $4.4 million, with smaller businesses often facing costs that threaten their survival. However, maintaining coverage now requires implementing and documenting security measures that many smaller firms haven’t traditionally prioritized.
The Security Controls That Matter Most
Let’s be specific about what cyber insurance carriers are requiring in 2026. These aren’t suggestions. These are baseline requirements that most carriers won’t negotiate on.
Multi-factor authentication must be enabled on all accounts that have access to email, financial systems, client data, and remote access to your network. According to Marsh McLennan’s 2025 Cyber Insurance Market Report, 99% of cyber insurance applications now include specific questions about MFA implementation, and 87% of carriers require it as a condition of coverage.
Regular backups with offline or immutable copies are mandatory. You need to prove you’re backing up critical data daily, testing restoration regularly, and keeping at least one backup copy that ransomware can’t reach. Carriers want to see evidence of the 3-2-1 backup rule: three copies of your data, on two different types of media, with one copy offsite and offline.
Endpoint protection that goes beyond basic antivirus is required. This means managed detection and response, not just a set-it-and-forget-it antivirus program you installed three years ago. Carriers want to see that you’re actively monitoring for threats, updating security software promptly, and have someone watching your systems who can respond when something looks wrong.
Security awareness training for all employees has moved from recommended to required, and it is not limited to a single training session at hire. Research from KnowBe4’s 2024 Phishing Benchmarking Report showed that organizations with ongoing quarterly training reduced susceptibility to phishing attacks by 86% compared to those with annual or no training. Carriers are looking for documented, ongoing training with testing.
Email security beyond your standard spam filter is increasingly common as a requirement. The majority of successful attacks start with email, so carriers are paying close attention to what you have in place to filter out malicious messages before they reach your employees.
The Documentation Burden
What catches many firms off guard is the fact that having these controls in place isn’t enough. You need to document that you have them, document that you’re maintaining them, and be prepared to prove it when your carrier asks.
This means maintaining security policies that spell out your requirements. Not generic templates you downloaded from the internet, but actual policies that reflect what you’re really doing. It means keeping records of your training sessions, your backup tests, your security updates, and your incident response procedures.
When you apply for cyber insurance or renew your policy, you’ll fill out detailed security questionnaires. These are getting longer and more technical every year. Your answers need to be accurate because if there’s a claim, the carrier will audit what you actually had in place versus what you said you had in place. Any discrepancies can and will be used to deny coverage.
What Compliance Readiness Actually Looks Like
Compliance readiness for small business cyber insurance isn’t about being perfect. It’s about being honest about your current state and having a plan to address gaps. If you’re a 15-person law office, nobody expects you to have an enterprise-grade security operations center. But they do expect you to have implemented the baseline security controls appropriate for your size and risk profile.
This means conducting regular risk assessments to identify your vulnerabilities, maintaining an incident response plan so you know what to do when something goes wrong, testing your backups periodically rather than assuming they work, and being realistic about your technical capabilities and getting help where you need it.
Many professional services firms are finding that they need outside assistance to meet insurance requirements. That isn’t a failure on your part; it’s a recognition that security policy development and ongoing security management require expertise that most small and mid-sized firms lack in-house.
Taking Action Before Renewal
If your cyber insurance renewal is coming up, start your security audit now, not two weeks before your policy expires. Your audit should include:
- Working through the security questionnaire carefully
- Honestly assessing where you stand on each requirement
- Developing a realistic timeline and budget to address any areas where you are not compliant
Understand that improving your security posture may actually reduce your premiums or increase your coverage options. Carriers are willing to work with firms that demonstrate a serious commitment to security and consistent progress. What they won’t tolerate is firms that misrepresent their security controls or ignore requirements after purchase.
If you’re getting quoted higher premiums or having trouble finding coverage, the problem is probably in your current security posture, not the insurance market. Rather than shopping for a cheaper carrier that asks fewer questions, focus on getting your security house in order. The savings from slightly cheaper insurance won’t help you if your claim gets denied when you actually need coverage.
For professional services firms serving clients in accounting, legal, or property management, your security posture is increasingly part of your professional responsibility. Your clients trust you with sensitive information. They expect you to protect it. Meeting cyber insurance requirements in 2026 is really about meeting the baseline expectations of professional data stewardship.
Quick and Easy
Cyber insurance claims increased 74% in 2024, forcing carriers to require documented security controls, including MFA, tested offline backups, endpoint protection, and ongoing security training. Professional services firms must implement and document these controls accurately to avoid claim denials in the event of a breach.
Look, I get it. Multi-factor authentication is a pain in the butt. It slows you down when you’re trying to get work done, it interrupts your flow with prompts at the worst possible times, and yes, it makes you feel like technology doesn’t trust you anymore. Your team is going to complain about it. Some will actively try to find workarounds. And honestly, I don’t blame them.
The thing about ransomware, though, is that it’s worse.
I’ve been managing IT for professional services firms for over three decades, and I can tell you that the conversation we have after a breach is exponentially more painful than the conversation about implementing MFA. One is an inconvenience. The other is a catastrophe.
The Uncomfortable Truth About Endpoint Security
The professional services industry is getting hammered by ransomware. Accounting firms, law offices, and property management companies are prime targets because you have exactly what criminals want: sensitive financial data, confidential client information, and typically just enough technology to be vulnerable but not enough to be fortress-like.
According to the FBI’s Internet Crime Complaint Center, ransomware complaints increased 18% in 2024, with losses exceeding $59.6 million. However, those numbers only capture reported incidents. Most small and mid-sized firms never report attacks because they’re embarrassed, worried about reputation damage, or they just paid the ransom quietly and moved on.
When someone gets ransomware into your network, it doesn’t just encrypt your files. It steals them first, then encrypts them, then threatens to publish your clients’ private information if you don’t pay. Even if you have backups, which you should, you still have a data breach on your hands. You still have to report it. Your clients still find out. Your reputation still takes a hit.
You know what the entry point is in most of these attacks? Stolen credentials. Microsoft’s Digital Defense Report found that password-based attacks increased 146% in 2024, with more than 7,000 password attacks happening every second across their platforms. Someone phished an employee’s password, logged in as them, and waltzed right through your front door like they owned the place.
What MFA Actually Does (And What It Doesn’t)
Multi-factor authentication isn’t perfect. I’m not going to pretend it’s some silver bullet that makes you invincible. Criminals have already figured out ways around it, like session token theft, where a phishing site proxies you through the real login page so you authenticate normally while the attacker quietly captures your session cookie.
Here’s what it does: it makes the cheap, easy attacks fail. The automated bot that tries 10,000 stolen passwords against your email server. The script kiddie who bought a dump of credentials on the dark web. The lazy criminal who isn’t willing to put in the extra effort. According to research from Google, implementing any form of MFA blocks 99.9% of automated attacks. Even the most basic SMS-based authentication stops the vast majority of credential stuffing attacks cold.
Think of it like locking your car doors. Will it stop a professional car thief with the right tools and motivation? No. But it will stop the opportunistic criminal who’s just walking through the parking lot trying door handles. Most cybercrime is exactly that: opportunistic.
Why Your Cyber Insurance Company Cares
Something that might make the MFA conversation easier with your team: it’s not really optional anymore. In 2026, cyber insurance requirements have gotten strict enough that most carriers won’t even quote you coverage without multi-factor authentication on all your critical systems. Email, remote access, financial systems, client portals. All of it.
I’ve seen insurance companies do post-breach audits and deny claims because MFA wasn’t implemented properly. Properly means fully deployed and actually used, not partially rolled out, and not “we were planning to get to it.” They will look at your authentication logs, and if they see that the account that got compromised didn’t have MFA enabled, that’s it. Claim denied. You’re on your own for the six-figure recovery costs.
Making It Less Terrible
The good news is that MFA in 2026 is better than it used to be. Not good, but better. You’re not stuck with those horrible SMS codes that never arrive when you need them. Modern authentication apps are faster. Hardware security keys work better. Some services even use passwordless authentication now, which sounds scarier but is actually more convenient once you get used to it.
The key is implementing it intelligently. You don’t need to make people authenticate every single time they access their email if they’re on a trusted device on your network. You can set reasonable timeout periods. You can use conditional access policies that only trigger extra authentication when something looks suspicious, like a login from an unfamiliar location.
You need to train your people not just on how to use MFA, but also on why it matters. Not with scare tactics, but with reality. The Verizon 2024 Data Breach Investigations Report found that 68% of breaches involved a human element, whether that’s stolen credentials, social engineering, or simple mistakes. Tell your team about the law firm down the street that got hit with ransomware because someone clicked a phishing link. Tell them about the accounting practice that had client tax returns published online because their insurance claim got denied. Make it real, because it is real.
The Reality of Small Business Ransomware Protection
Look, if I’m being completely honest with you, which I always am, no security measure is going to stop a determined, sophisticated attacker who specifically targets your firm. But you’re probably not going to get specifically targeted. What you’re trying to protect against is being the easy target, the firm that criminals hit because you’re vulnerable and they know it.
Multi-factor authentication is one piece of a larger endpoint security solution. You also need proper backups, security monitoring, email filtering, security awareness training for your team, and someone who actually knows what they’re doing managing all of it. But MFA is the piece that insurance companies look for first, and for good reason.
If you haven’t implemented multi-factor authentication yet, start now. Check with your cyber insurance carrier about their specific requirements, because they vary. Get your critical systems secured first: email, financial software, anything that touches client data, and any way your team accesses your network remotely.
And when your team complains, which they will, remember that their annoyance is temporary. A ransomware attack isn’t.
Quick and Easy
Multi-factor authentication blocks 99.9% of automated attacks and is now required by most cyber insurance policies. While your team will find it annoying, the alternative of ransomware attacks and denied insurance claims is far worse for professional services firms.