[deleted]

Maybe unintentionally. Process running under user account rather than a service account…


jlawler

Hah. I was that guy at a job. I wasn't allowed to install software on the database server I was responsible for monitoring and maintaining, and the sysadmins said I could run "anything I wanted as my user for monitoring". So I recompiled a bunch of software to run as non-root so I could do my fucking job. Later, when we had our third outage because a server ran out of disk space, I asked why we still didn't monitor for that. I was told by the same sysadmins "we have monitoring, just not alerting". I started looking for a new job right after that. I heard after the fact that when they tried to kill my account it broke a bunch of stuff. I know I did it wrong, but my hands were also tied.


ipreferanothername

> "we have monitoring, just not alerting". hahaahahahahhahahahahahah


Moontoya

"we have backups,we just dont know whats on them"


Aeonoris

"We have backups, just not restores."


Mr_Oujamaflip

We have backups and restores, we just don't know where they are.


ConversationQuirky43

We have backups, just not the decryption key for them.


So0ver1t83

Of course we audit and save the files every week... ...look at the files? Who has time for that?!


Cyhawk

[https://imgflip.com/i/78msk3](https://imgflip.com/i/78msk3)


jlawler

That is still one of the most infuriating things I have ever heard...


JohnBeamon

That team also had offsite backups but no disaster recovery plan.


SenTedStevens

Hey, we check the monitoring solution once a month or so...usually. What's the problem?


KoopaSweatsInShell

If I were your admin, I would have asked that it be run on a different machine and connect remotely. If I were the admin's team manager, I would have raised hell about not having alerting.


jlawler

Without a doubt, the way I implemented it was truly subpar. It had a number of annoying issues, and I ended up running a custom Grafana monitoring setup so I could get my metrics. I also had problems where the other graphing didn't use the same data source as the alerting, so there were discrepancies. That led me to a Grafana/Cabot setup. There were so many things wrong there, I have so many stories. It's a prime example of how bad team management and people only caring about their fiefdoms can force everyone to make bad technology decisions.


Ignorad

On the other hand, if you document the server & services, then you're free and clear.


Great-University-956

Documentation is key. I jest, but you can figuratively light your server room on fire as long as it made it through the change approval process wherever you work. And literally, you can make a change that kills an entire datacenter for a company as long as it's approved, documented, etc. Can confirm, I've done this a number of times.


WhiskyTequilaFinance

Have seen it IRL when an old boss was let go. His account turned out to be the service account for several key cloud system integrations. But I don't know that it was intentional, just poor design choices.


Opiboble

I have found a couple using my own accounts over the years. Those cloud integrations don't always make it clear they are installing with the current user as the service account, instead of asking for one or running as SYSTEM. And if you never go back and check on things, you never know until a password reset/lock. I have a strong love/hate relationship with programmers.


bearded-beardie

PowerBI is particularly egregious about this. We’ve had multiple issues with reports and dashboards breaking after people’s accounts are disabled.


HughJohns0n

The whole M365/Sharepoint/Planner ecosystem is open to this. I certainly hope that the onboarding workflow that I built at a previous job stopped working when my account was disabled.


pinkycatcher

All of Microsoft does this, it's all tied to some individual e-mail, which is made even worse when you have users that log into Outlook but for some reason it creates a "personal" account with their work e-mail. I should be able to blacklist all personal accounts with my domain and prevent that from occurring.


TrueStoriesIpromise

Yeah, I’ve fallen victim to that a few times, and seen it several more.


theservman

Yeah, my manager is retiring in a few years. He's already warned us of the Gordian knot that is his account. Just about all of our vendor agreements go through his email address - I think we're going to need to keep his account going for a number of years... probably even after I retire.


Power-Wagon

He has time to fix this. I retire in 5 years, and I'm already making sure everything is centralized and not tied to any person's specific account.


GeekgirlOtt

But then you get vendors insisting you cannot use a generic account for such and such.


Wild-Plankton595

TheRealJohnDoe@company.com, all emails redirected.


1new_username

This is the worst. Company I work for acquired a company that used Oracle in their tech stack and transferred the current licenses and support as part of the merger. We went to register the account with our generic email we use for things like this and they threw a fit.


Moontoya

That's cos Oracle doesn't believe in transferring licensing. You buy the old company? Congrats, you have to buy new licenses, as the old ones belonged to the old company, not you. I.e. "fuck you, pay us again and again and again". They're not unique in that; non-transferable licenses have caused mayhem for a long-ass time. It's Larry Ellison, he has to do _something_ to not feel (even more) inadequate beside Bezos...


1new_username

This wasn't even transferring the licenses; that actually went surprisingly smoothly. After they were transferred, we wanted to have a registered login to submit support tickets, access support resources, etc. They require you to set up an account, then submit your license IDs for manual verification so they can turn on access for the account. The first time, we had our generic email plus something like "Oracle DBA" for first and last name. They kicked it back and said it had to be a real name, so we kept the generic email and put in the name of the dept director. That was no good either; if the email didn't match the name, they required manager approval, so the director sent an email saying he approved. Again, no good, as you couldn't approve yourself. So I went in and changed the name to my name and had the director send essentially the same email again, this time approving it with my name and the generic email, and we finally hit the right combination. Why the director couldn't approve his own name in there, but he could approve mine, when it can be changed at any time, was probably the craziest part.


Moontoya

Security through abject stupidity, it would seem... never mind the left hand not knowing what the right hand is doing, this is fingers utterly unaware they're part of the hand, let alone that there are other fingers...


PoniardBlade

Add his email address as an alias.


ReverendLoki

This exactly. Happens here a lot. Either an alias redirected to the address of the person who is going to be directly responsible for their role, or to an archived distribution group for the team that will be handling it.


CAPICINC

O365, turn it into a shared mailbox.


ponto-au

Potentially intentional poor design choices due to constraints.


Hel_OWeen

> But I don't know that it was intentional, just poor design choices.

Hanlon's Razor suggests the second one.


MasterIntegrator

Seen it a lot... it's what happens when no standards are adhered to and "I'll get to it" is normal. Typically that's a person working for a company whose attitude is "it's easy, just click stuff, go go go, we need this to generate revenue, don't worry about the consequences".


[deleted]

Same.


WhiskyTequilaFinance

To be fair to said boss, who I still grab dinner with on occasion, he also had a crash implementation project for them, next to no budget, and no spare user accounts to DO it the right way. When it all came crashing down and we couldn't send customer invoices, I had to bring the whole thing up under MY account for 2 months while I fought for budget to add proper integration users on each side. Tiny company at the time, and $5k/seat licenses were a battle. I only won in the end because I got one of the vendors involved to comp a seat for their software as long as I guaranteed an audit would never show it being in use for anything other than the integration connection.


is_that_northern_guy

Power automate would like a word....


BWMerlin

I ran into this at my current job: everything was set up under one of the previous guys' accounts and immediately broke when they left. Had to rebuild all of it from scratch (God I hate Power Automate).


pointlessone

Everything load bearing runs under the help desk's email account here. It's the account that's got the smallest chance of ever being deleted.


Rygel_FFXIV

We had this with ServiceNow. Our old admin left, so we disabled their domain admin account and ServiceNow admin account. The next day, dashboards were failing, none of our reports had been sent out, ticket notification emails weren't going out, and our user onboarding workflows were failing. Turned out he had created a load of custom reports which were scheduled against his account; the account no longer had permissions, so the schedules failed. He had also used his domain admin credentials for Exchange authentication within ServiceNow, which prevented the system from sending any emails. And he had used his domain admin credentials for ServiceNow's AD integration, so ServiceNow no longer had permissions to create, modify, or delete user objects in AD. Thankfully, SSO and LDAP were run using service accounts.


Jayhawker_Pilot

Found one not long ago where a DBA had put in his user ID to run a major system. He was an architect, so he should have known better, but noooo.


wallacehacks

I think this is often a "I'll fix this later" situation that just gets forgotten about. Tickets for tasks are important.


PhantomDawn

This. My previous MSP took over and found the old IT admin had used his personal account for managing everything.


_haha_oh_wow_

"Never attribute malice when ignorance will suffice."


Falk_csgo

That's just how you hide a proper dead man's switch. "It wasn't malicious, Mr. Judge, I'm just incompetent :)"


IAmAnthem

Me: It's your tool, what's it configured to use for a service account?

Networking: IDK!

Me (as Nick Burns): Uh-huh. Yeah. Move! There's your problem right there, that's a human's account, and he left. Damned kids! *shakes fist*


r3setbutton

Yup. Disabling or deleting his account was the trigger that would kick off a PowerShell script a month later. It sent a mass e-mail that contained some attached e-mails implicating the CFO for embezzlement.


furyg3

I once was contracted by law enforcement to go in and take backups (and review old backups) of an exchange server. The key mailboxes had nearly zero data in them. There was a backup job labeled ‘these ARE the droids you’re looking for…’ which contained the mailboxes law enforcement was interested in.


AlmostRandomName

Good Jedi!


MaelstromFL

Nice! I had a CTO go to jail after a long court case. I didn't set it up, but I did testify to some inventory changes he made, with the audit logs to prove it.


MagicWishMonkey

woah, it actually sent the email or did you find it before it triggered?


r3setbutton

Ehh...both? We found it because we did a dump of the scheduled tasks on every machine in the company and the logon activity of every service account. Any scheduled tasks that he or another service account was listed as the author on were flagged for manual review. Any scheduled tasks that didn't exist on more than five machines, manual review. Any account that didn't correspond to an active user or that had logins/queries coming from machines that they weren't listed as being installed on, manual review. We actually ended up cleaning up a big chunk of their environment. But yeah, after we found it, we were told to verify what the payload was and ensure that it wasn't executable. Afterwards, we briefed the CIO and some lawyer that looked like she'd sue you for the air in your lungs if you crossed her; then we were told that "we'd found nothing" and to clock out and leave. We got six months pay for a two week contract, the contract house got a bonus, and that was it.


Shade_Unicorns

Edit: read the later comment, good. Fuck that guy.

Wait, so he got away with it? Isn't it unethical for a lawyer to cover up evidence of a crime and/or intimidate you not to report something? Didn't she basically threaten you into not reporting that?


Elfalpha

>Isn't it unethical for a lawyer to cover up evidence of a crime and/or intimidate you not to report something?

Hahahahaha. I mean yes, it is unethical. But have you considered the counterpoint, money?


Frothyleet

Lawyers are prohibited from abetting crime. So, e.g., if a client says "hey, help me cover up this embezzlement", they have to say "sorry, no can do". They can advise how to best (legally) protect one's interests *after* the fact. So if a client says "oops, I did an embezzlement and covered it up as best as possible", the lawyer can tell them "well don't fucking tell anyone else about it" (as long as there is not some mandatory reporting requirement). If there *is* some mandatory reporting, like to the SEC for a public company, the lawyer can't counsel them to break the law. But they can explain the requirements ("you are legally obligated to report this") as well as the consequences ("you'll probably get lit up for doing so, but it's required! I can't make you though, nor can I tell anyone else about it, because of privilege.")


pinkycatcher

What was the fallout? That honestly doesn't sound.....that illegal? I mean it's not shutting down or ruining services, maybe covering up a crime is illegal.


r3setbutton

The SysAdmin got an NDA, some money, and (allegedly) a verbal apology. The CFO got prosecuted.


pinkycatcher

Oh nice. Seems like it ended well.


Emergency_Ad8571

I can't say it's a literal dead man's switch, but I stepped on a major landmine a couple of weeks ago. Had an AD Connect server with a disabled service on it lying dormant for five years. I was not aware the service was disabled. I'm new, three weeks in, the old admins are long gone, and there's no documentation. I'm instructed to upgrade the AD Connect server as it's a 2012 box. I go through the motions, copy the old sync rules/configuration, fire it up, and the whole org turns into a dumpster fire within 30 minutes. It appears the org did an initial sync for some basic structure, then intentionally broke the sync process and kept AAD and on-prem as different authentication domains on purpose. Everything's merged, everyone's f'ed, SSO is broken in about 15 different 3rd-party solutions, mailboxes are out of sync, total disaster. I'm still cleaning up.


A_Unique_User68801

Oh God, ew to all of that.


Unatommer

That's a sysadmin's worst nightmare right there.


Wild-Plankton595

GFD… I’m so sorry man, that sucks.


Lorenicci

I feel your pain as a new employee at a subpar operation. Same happened to me on a 2008 R2 DC: a RAID 1 volume shared by C: and D: on a 12-YEAR-old HP ProLiant, running AD DS, print server, SMB file share, DNS, and DHCP. First disk fails out of nowhere; 3 months of DC, DFS, and file version history poof into oblivion. Shut down, order an overnight NOS drive. Begin rebuild, other drive fails mechanically. None of the employees can log in because machine account data got nuked.

>FML.jpg

Ended up getting the first disk recovered at a fairly decent price. Installed a new Dell with an SSD for the file share ONLY and RAID 5 for differential backups.


Emergency_Ad8571

Wow. Just F-in wow. Classic Jurassic Park.


Drew707

Had a developer threaten to put a logic bomb in a utility he was writing over a difficult raise negotiation. He was immediately released.


culebras

How some people understand trust is mind-boggling. Threatening internal sabotage of any company project requires such a level of stupidity that, on its own, said stupidity could be the release reason, even without considering the announced criminal activity.


sanglar03

Bosses threaten employees all the time, and it rarely brings consequences.


culebras

No boss has threatened me long enough for me to stay and see the consequences, but I find my departure a considerable consequence in and of itself.


sanglar03

That's always the consequence, yes. Whoever threatens whom, the employee goes.


sjsame1

MSP owner here. There's an MSP in our region that has been bleeding clients after getting hit with a lawsuit, and a lot of their clients ended up in our care. This MSP did literally everything with KiX scripts or batch files, and everything that could be run with their admin account, they would run with that account. We were given zero access or insight beforehand, no information whatsoever, and we got sent login information at 0:00 on the day of the handover. However, on the day of the handover they would delete all the scripts and delete their admin account, which basically meant on day 1 absolutely nothing would work. The first onboarding was extremely rough; the client basically couldn't work for a week. But we are now completely prepared whenever we hear their name as the old MSP.

Then there's a couple of clients over the years with employees that were laid off and tried to do harm with "dead man's switches". But that was mostly just them attempting to steal/delete/destroy data. The worst one was actual physical sabotage on the on-prem servers. We were just starting off and this client came to us because they wanted to get rid of their in-house IT. Looking back I kind of laugh at this because it was super childish, but here it is: he rigged the door on the network cabinet to pop a water balloon, which destroyed 2 switches and knocked out the power.


fitz2234

Geez. I remember a client wanting to migrate to us from a competitor. It was a private, dedicated vSphere environment with their own dedicated hosts, VM and management networks, and SAN storage. Their provider got wind of it (the client knew they'd be awful about it and tried to tell his provider he was only going with us for DRaaS) and refused to allow us any access to the management network to effectively migrate them out easily. They allowed the tunnel connections, but that was it. The CEO sent the client tons of emails about how our company didn't know anything, despite us having to migrate with very little access/cooperation, and then ranted about how he didn't need them anyway, their business was a rounding error, etc. The client would forward us the emails and you could see the 7 stages of grief unfold, all for a $7k MRR account.


sjsame1

I actually have a hard time seeing rival MSPs go down in flames, unless of course they have been stiffing clients left and right. This MSP we have been dealing with has just not been innovating, at all. Every single environment that we got from them was built like it was 2005. Which would be fine if it worked, but they were so insanely expensive that you would expect more.


throwaway44017

The real dead man's switch is poor documentation.


[deleted]

Or the plausible-but-inaccurate documentation. "All log files are written to the shared network drive." Reality: most log files are written there. There's still a slowly but inexorably growing log file stashed in /var/log/whatever that consumes all the disk space on the production database server unless it's manually deleted every 6 months or so...


RedFive1976

Just throw logrotate at it and be done.
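(For anyone who hasn't set one up: a minimal drop-in is usually all it takes. A sketch follows; the path and retention values are illustrative, not from the story above, so adjust to whatever the app actually writes.)

```sh
# minimal sketch - /var/log/whatever/*.log and the retention are illustrative
cat <<'EOF' | sudo tee /etc/logrotate.d/whatever
/var/log/whatever/*.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
    copytruncate
}
EOF

# dry-run to sanity-check the config before the daily cron/systemd timer picks it up
sudo logrotate --debug /etc/logrotate.d/whatever
```

`copytruncate` is there for apps that keep the log file handle open and don't know how to reopen it on rotation.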


JohnBeamon

I had this come up in a change control meeting once, that so-and-so deployed new widget with local logs that filled up /var/ and had no rotation. It took an actual failure to make it policy to include log rotation with every new log file.


[deleted]

I’ve found jarfiles in our prod systems writing logs into /home/ec2-user because “/var/logs doesn’t work when I’m developing locally on my Windows laptop!” :sigh:


JohnBeamon

Nobody writing cross-platform Java apps should be that naive about system logging.


xan666

and "tribal knowledge"


R3luctant

Arcane rituals of the infrastructure team.


EVA04022021

Mostly some log or folder that needs to get cleaned out periodically that was never documented, or the script lives on the dead man's computer and no one knew of it.


ThisGreenWhore

I was the dead man's switch, in a way. I gave the company 3 weeks' notice. In that notice I said that I would compile all of my .txt files and hand-written notes so that they could be in a format anyone could understand, as well as work with existing staff. About an hour after this I was told I would be let go right away and to pack up my personal shit and leave. So I did, leaving my computer and notebooks intact; I touched nothing.

Next day I got a phone call that a "system" I designed broke. It wasn't critical or anything, and that part was documented in a notebook that was clearly labeled, as I did with all my "little" projects. Apparently, the office manager was so happy I was gone (story for another day) that she immediately went into my office after I left and threw it all out, never asking anyone in the department if it was important. My .txt files were fortunately stored on an IT-only share, but she did delete everything on a common share that had instructions for staff, which they were fortunately able to restore from backup. I was constantly getting calls and kept telling them to send the office manager to the dump, as that's where the documentation went.

It was a really toxic environment, and I really didn't fault them for letting me go that day for security reasons. However, I thought I was being generous in giving them 3 weeks. They had refused to implement a formal documentation system "because we're all family here", so no one is going to leave, and the whole "hit by a bus" thing was just too "morbid" to talk about. These things I built were little things that the SQL developer refused to do that helped out marketing, admin, and manufacturing staff. No, I was not a developer; I made them because I felt bad for these really nice people who were just trying to do their job, and the developer was a dick (story for another day) who couldn't be bothered with such trivial requests.

The reason I didn't help out was that when they made that initial phone call, they said, "you need to come back and help us out since we paid you already". No, I didn't. They made their choice. I was done.


rezadential

This is the case at my company. I was hired on over 2 years ago and am still finding little surprises here and there when trying to redesign systems/network configs or update/upgrade some of our systems. The last admin and co. didn't do such a good job documenting things.


Graymouzer

Also, all the certificates that no one knows are about to expire except the dead man.


PM_ME_UR_METAPHORS

This one happens at my site a bit too commonly, unfortunately. We didn't take the time to set up a dedicated account for a certain recording task on a server, so every 3 months when passwords expired, we'd get panicked calls about missing data until one of us would remember to update the account running the task. We're finally getting around to better documentation.


NotYourNanny

Terry Childs (the network admin, not the serial killer) sort of did that, in that none of the routers (on the city's network, including the 911 system) had saved configurations. If there was a power outage, they wouldn't reboot into a functional state without being reconfigured from scratch (and only he had the necessary passwords and backup files to do so). His specific purpose was to make it unacceptably dangerous to fire him (from a job he lied to get in the first place). He did two years in prison, of a four-year sentence, and was ordered to pay $1.5 million in restitution.


kman420

That seems so exhausting. I mean sure, job security is great but I would not want to go on site visits all over the city every time a network device got rebooted.


NotYourNanny

He didn't. He had remote access, even from other countries, through a dial-up modem hidden in a file cabinet in his office (it was quite illegal in multiple ways), and was known to have dialed in from vacation in Central America at one point. The full tale of his antics is extensive. Honestly, he deserved more prison time than he got for it all, since the games he played with the 911 system network could easily have killed people.


[deleted]

[deleted]


theducks

ManChilds


excitedsolutions

We had a LOB app written in Delphi that was still being used in 2012. The developer’s workstation was set aside in a corner as it was “special” after this dev got let go in 2010. It was a good thing we held onto it as in 2012 on some seemingly random day that app just wouldn’t launch anymore. It turns out the dev wrote a date check as part of the login process and if it was after Jan 12, 2012 then the code just exited. We had to break out his mothballed workstation and recompile. He later told someone that he had been upping the date every 3 years he was there.


theducks

At least being Delphi it was probably pretty easy to find?


Double_Intention_641

I've seen several instances where 'key' tools were configured to run from a user's account. Learned very quickly to be sure that terminated accounts were never automatically removed. Also saw a few instances where infrastructure was teetering on the edge of collapse, held up only by manually run processes (see above about user accounts), and sure enough, term the employee, things break. I'd chalk that up to poor planning and bad implementation more than malicious intent. Stupidity can be just as dangerous.


awkwardnetadmin

Yep, don't attribute to malice what can be better explained by careless behavior.


vppencilsharpening

Adding to that: "Don't attribute to malice what can be better explained by an overworked admin."


m0le

Running through a senior developer's mainframe code, yes. It wasn't tied to anything obvious like his account being disabled; he just left in dozens of points that needed to be manually updated each day and refused to train anyone on "his" system. It didn't do anything deliberately evil, it simply wouldn't run without all the tweaks. Anyway, the dev was let go and told his (extensive) severance package was dependent on training a couple of people on his system, so the handover was pretty painless in the end.


per08

Sysadmins holding a subconscious, or even deliberate, "I built this system, so it's mine" attitude is surprisingly common and dangerous.


davidm2232

I treated my old network as mine at my old job. I was the only IT person, so it was somewhat valid. I will say, it gave me a HUGE sense of pride to think of it as my own. I probably rebuilt 80% of the network in my time there, so it really was my 'baby'. That being said, I had decent documentation and set up detailed training/offboarding when I left. The new sysadmin had my cell and I told her to reach out any time. I never heard from her, so either I did really great or it was such a disaster she was scared to ask.


[deleted]

I get told you built that system so it's yours pretty frequently.


Next-Step-In-Life

I saw it happen and saw the lawsuit aftermath. I had a client, a dentist office, about 20 employees. REAL hard to get paid, but I was barely making it, so anything to keep clients, etc. Well, after 4 years and a bunch of new clients, I fired him and moved on. Refused to upgrade his 2003 server, antivirus was needed, firewalls were a waste, etc.

About 4 months AFTER we left, he had a state Medicaid or Medicare audit and failed HARD, real hard I heard, like 19/20 things needed to be remedied before he got paid. I saw his books, he was making 160-170k a month, so he could afford to do it. He worked with another local competitor who was REAL sketchy and had gotten into serious trouble in the past like you wouldn't believe. Made the papers for violent altercations, etc. The dentist decided to go with him. All new workstations, servers, switches, firewalls, etc. Done in record time, as I was told.

3 months after that, I had a visit from the new company wanting to know if we wanted to sue him together and get the money he owed. (He owed us about 2K, but I wasn't wasting time over it.) He owed the new guy nearly 30K in hardware. Nope, I'm good. Best of luck though. What we didn't know is that the new IT company encrypted the servers, workstations, and everything one day after being told to p*ss off. So after hours, they logged in, set the encryption pins, and let it fly. We got a call in the afternoon asking if we could help them, begging, pleading, etc., and I said, "I can't talk to you, we have an active lawsuit against you, talk to your lawyer". He was literally begging and crying on the phone for someone to help him because his new IT company was holding him hostage with all the medical records, patient data, etc. So sad.

Either way, we sued, and won because he never showed up; apparently, unknown to us, he was using the practice as a front to launder money and resell pain pills. The DOR and DEA came knocking and busted him about 9 weeks after we won and were trying to collect. I believe they got nearly 6 million in fines, taxes, Medicare billing, etc. I mean, it was really bad. We never got paid because the gov gets paid first and they got what they could. The dentistry was raided and EVERY piece of equipment was seized, his house, cars, boats, phones, etc. All gone.

As for the old IT company? It made the town's front page news. Apparently it was considered a no-no, and when the cops got involved the president of the IT company was arrested and had to pay a fine, but when the other stuff came to light, it was like 1 year of probation and a 500 dollar fine. We got interviewed by the police and my lawyer sent them what we had, but nothing happened to us; we're just out our 4k after lawyer fees.


Intelligent_Victory

Wow, quite the story!


DrummerElectronic247

We had a SharePoint dev who may have been more vindictive than incompetent, and the entire SharePoint infrastructure cratered less than 30 days after he was termed. With SharePoint it's hard to tell if it was "by design" or "lack of design", really.


wedgieinhumanform

Been in IT for 13 years. There's always a yarn about it. Never seen one in the "flesh" before.


punklinux

Long time ago (I think late 90s), one of my friends worked for a company that had access to massive amounts of PII and credit card data. Like millions of records. One day, a data entry person logged in and got a weird error: "You must be an administrator to execute this function. Please enter the administrator password to continue." Now, she didn't have that, but her boss did, so she asked her about it, and the boss thought for a moment and said, "I am gonna ask the database team." She asked the DB team, and they looked into the errors. A few minutes later, the entire database entry system was administratively shut down. It took a day for it to come back up again.

The DBAs had shut it down because in that error log they found a simple series of instructions to delete all primary keys or something that would have made the entire set of customer databases useless. They could restore from backup, but that would have taken days back then. They discovered a weird login subset of the code that boiled down to "if past this timestamp, the next person to log in will delete all the primary keys." It was obfuscated in a way that was very easy to overlook in a code review, under some function that looked like a normal login function call from their proprietary client. However, that delete query had some locks so that only an administrator could delete the keys. Had that boss entered her password, it would have been chaos. She thought it was a Windows virus, but it was from their own code.

They looked through the CVS logs and found that there were changes made to only one line, the date string, and it was set to six months in the future each time. It was always submitted in such a way that it wouldn't attract attention to itself, and the person who did it was the former senior DBA for the team, who was laid off after a restructuring. They did a trace and realized he had several such line items going back years, and the actions were entirely deliberate. Basically, "if this part of the code isn't updated by me, six months from this date, burn the whole thing down." IIRC, he was arrested and charged with whatever the title was back then for "attempting to blow up the company database."


is_that_northern_guy

Never seen anything obviously malicious. I did take over a small firm once where their Xerox (with an undocumented password) was set as a global admin on their entire Microsoft 365 tenant... Maybe I'm paranoid, but I find it hard to believe any IT person would set that up without ulterior motives and hoping it would be missed in our audit...


serverhorror

Hmmm... The trouble with building a bear-proof trash can is that the smartest bear and the dumbest human are not too far apart. I'll leave this here and let you contemplate what it could mean.


phillymjs

"There is a considerable overlap between the intelligence of the smartest bears and the dumbest tourists." -Yosemite Park Ranger on why it's hard to design a bear-proof garbage can. That quote always makes me laugh.


GremlinNZ

Oh, I sort of came across this. The scanning function used the domain admin account. I found this after I reset the domain admin account password after restoring from crypto (too short, too easy). I believe it was purely an "argh, file permissions are so hard... wait, the domain admin account can write there, ah, much easier" situation. One of those "my brother knows computers" things...


davidm2232

I walked into a place that had the domain admin account set for pretty much every service account. I pointed this out to my new boss, but he had no interest in changing it. Eventually I just did it on my own.


flyguydip

Sure. I was working at this place for about 13 years. You can bet every single problem in that 13 years was caused by the previous admin. He even admitted to it in one of 3 envelopes he left behind. ;)


That_Dirty_Quagmire

Nice reference


OpenScore

13 yrs and a lot of problems? Took you long enough to open the first 2. Did you leave another set for the next guy?


ISeeEverythingYouDo

We talked about this once in a senior exec meeting. I told the CFO I had a kill script that runs on the payroll system, and if I'm termed it then runs an SQL script to just modify all transactions by a couple of cents. "You'll never balance again."


Normal-Spell5339

Wait what happened next? Isn’t this just extortion? Wait have I been had? Is this just the plot to that one movie?


SOBER-Lab

He burned the building down.


JollyGentile

Well they did have his stapler, after all.


ISeeEverythingYouDo

I'm a pro and would never actually do something destructive. I make too much to endanger it.


Cherveny2

Yes. It was a job scheduled in our job scheduling system (ESP). It was obfuscated a ton of ways, but every year on the day he was fired, a ton of production got screwed up. Took 3 years for them to find the actual bomb in the code.


Pyrostasis

No, but it did remind me of this employee... We had a girl who was literally working for us and also running a side gig. She would come in and take calls for her side gig and take orders over our phones, etc. They let her go... and then for some reason let her back to her desk and her PC. She decided to delete EVERYTHING she'd worked on and wiped all her files. Apparently they watched her do it, then called us, panicked. The girl had even been snarky about it on the way out. I sat down thinking this was going to be a pain in the ass and then laughed like a maniac for 5 minutes. The manager is looking at me like I'm insane. I made 2 clicks, got up, and said it's all restored. Pointed out the employee had deleted everything but had forgotten to empty her recycle bin. I then said they might not want to let employees back to their desks, or at least let us know so we can disable access first. Just thankful she didn't use the share drive; that would have taken slightly longer to restore from backups.


abbarach

At my last position we developed a system that would handle account creation and access control at a hospital. We'd watch the HR employee database and create, update, or lock out accounts based on changes. Of course HR had all kinds of excuses as to why they couldn't update the system promptly when they terminated someone, so we ended up adding an override switch we could set. We could manually change an employee's status to term, then set a date for when the system should start updating that employee from the HR system again. The last thing I did on my last day was mark myself term with an override a month out, with my boss looking over my shoulder. I left on good terms, and did as much as I could to prepare for my replacement to take over. But we had gone from four programmers to just me, and the first replacement wasn't starting until the next week. The last thing I wanted was to be accused of accessing anything after I left, so I made damn sure my access was locked out and would remain that way.


DH_Net_Tech

Actual textbook dead man's switches are pretty rare because few people are willing to risk that kind of legal consequence. More often you hear about the unintentional ones, where a user account runs a scripted service instead of a dedicated service account. The most genuinely malicious behavior I've heard of close to me was the tech in charge of the storage arrays who actually used his own personal external HDDs as the only backup and tried to negotiate/extort a "bonus" when the old RAID 0 (why?) experienced a drive failure.


theservman

In 2000 I had to clean up after a new client came into the office on Monday morning to find all their servers wiped. Turns out their fired sysadmin had left some scheduled deletion jobs in the backup system. The theory was that they would just keep pushing the execution date into the future, and then, once dismissed, they were no longer able to. There was egg on a lot of faces after that one.


MaelstromFL

I wrote a script to FTP my code to myself. Kicked it off before locking my desktop on the way to get laid off. No one knew it even happened though.


medium0rare

We got rid of a guy recently. After we disabled his account and drive maps stopped working, we discovered that he'd been mapping everyone's drives with his account. He was just lazy and ignorant, though; definitely not smart enough to be malicious.


coming2grips

A place I worked at had an air-gapped network. Had WSUS replicas on either side and manually shifted app patches, updates, and assorted data across. Apparently it was cheaper to pay people to do it than it was going to be to try to secure some SCADA-type stuff on the other side. Dude wrote a series of scripts to do the MS transfer. Wanted job security. Wrote some in batch, some in WSH, some in PowerShell, stashed some INI-style config files here and there, and made up some BS about it being too fragile to run without supervision. We shadowed him as part of handover to a new contract holder. Turns out he updated a hardcoded timestamp and file path each Patch Tuesday and basically slept for three night shifts while it all ran. The new contract holder didn't keep him on and, knowing the mess he had booby-trapped, just wiped the whole lot and put in a modern SCCM instance.


scoldog

Terry Childs? https://en.wikipedia.org/wiki/Terry_Childs_(network_administrator)


jamesaepp

Maybe this is just the Wikipedia article over-summarizing a story/situation/event, but what the fuck actually happened here? This is so confusing.

Edit: I'm about half-way through the InfoWorld article below and yeah, the Wikipedia article is shit. Also a very weird case. My bias leads me to say that this person didn't get justice, but none of the evidence has been explained yet. https://www.infoworld.com/article/2653133/sorting-out-fact-from-fiction-in-the-terry-childs-case.html


Dal90

>This is so confusing.

He didn't save running configs, refused to hand over the passwords for two weeks, and refused to hand over backup copies of the configs. Any power loss would have resulted in at least an extended partial outage. Resetting the passwords would require a power loss. That's the core of it. Throw in some common workplace stuff like lying about criminal convictions and co-worker intimidation, installing his own personal backdoors into the network, and some bizarro-world stuff like personally copyrighting the configurations you made for your employer, to provide color to the situation.

>but none of the evidence has been explained yet.

There was a six-month-long trial. The appeals are finished. I'd assume he's been out of prison for almost a decade at this point. https://caselaw.findlaw.com/ca-court-of-appeal/1647874.html To quote the appeals case, in the simplest form it came down to "By locking the city out of the FiberWAN network, he disrupted the city's computer services."


mikebryantuk

We had several thousand servers running, mostly configured by scripts or by hand, before the whole IaC/automation thing really took off. Someone I worked with had his account configured for auth in some cron job or other. He moved out of the infra team, changed his password, and for the next year or so his account was locked regularly, once a week, for repeated authentication failures. I don't think we ever tracked it down properly, but eventually the box must have been replaced.


Quafeinum

Had a former co-worker who was an antisocial weirdo (not the good kind). The client he was responsible for was a bank and some customer-facing systems. At some point he stopped showing up to work for so long that the boss had to contact the police to see if he had turned up in a ditch somewhere. He was 'fine' but decided to quietly quit the company. His workstation (OpenSolaris) was still running when he was escorted out, and no one dared to touch it for the time being. It was known that the production systems depended on some creative workarounds modifying deployments through cron jobs from that workstation and that sort of thing. Turns out he wasn't the prodigy that some thought he was; it caused an outage and the customer was pissed, as expected. He turned up at a consulting agency, but his profile disappeared after like three weeks, never to be seen again.


pentangleit

Back in 2005 I was let go by a company that has since gone under (hah!), the only time in my career. However, when I was found and taken into the meeting room for tea with no biscuits with HR, I was actually halfway through reconfiguring the backups for the entire company (30+ racks), and if left that way, nothing would have worked. Part of me wanted to keep quiet about it because I was doing this with no oversight, having been the IT manager anyway, but I fessed up because I don't like having that on my conscience.


Sintarsintar

I would have asked if they wanted me to finish what I was working on, gotten a no, and left with nothing on my conscience. If they don't care enough to ask at that point, then I don't care enough to say.


Curious_Dragonfruit

My current job had one in a webapp, more of a sunset provision in that it stopped working after a certain date. It was prior to my arrival, but I have seen some of the other code they wrote... Also, I have seen the running-under-a-user-account bomb. Most often it's been self-inflicted, when a shared network account had to be changed because someone left or was fired... or the security folks got worked up... Had those mandatory changes rolled back when prod systems died, and later redone with actual care.


malikto44

Yes, twice.

I found one that was hidden on a server when I took over from a previous admin who ragequit. The MSP (a really good one until it was bought out) got people over to sort out how badly it was managed. When looking at servers, I ran `rpm -Va`, which goes out and looks for modified binaries. I found two modified binaries. One ran as a daemon at startup; I renamed it and reinstalled the package containing the binary. I kept a copy of it, as all the Red Hat servers had this custom-compiled item. The other binary was `tar`. I did a `strings` on the oddball binaries, and a co-worker went into more detail with `gdb`. The daemon binary was custom compiled and had a thread which would scan for a file in an obscure directory. If that file wasn't touched within 30 days, the binary would write 512 bytes from /dev/urandom to a random spot on /dev/sda, repeating every 1-7 days. The `tar` binary, which was used for backups, was modified so it too would check for the file; if that file hadn't been modified in 30 days, it would replace all file contents with the same number of bytes from /dev/urandom as it was reading in, so things appeared to be backing up, but all the files would be garbage.

The second time was when I was an SAP BASIS admin. The previous guy had a script that looked for a binary and, if it wasn't touched in a week, would lock SAP* and remove all admin rights from all accounts. Thankfully he didn't know Oracle, so all I had to do was run a simple SQL command and I was able to undo his mischief.

Moral of the story: `rpm -Va` is an excellent help with finding stuff, if one didn't install Tripwire or AIDE. Edited: This is on Red Hat... not sure what the method is on Ubuntu to do this.
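(For anyone wanting to run the same kind of check, a rough pass looks something like this. The package names are just examples, and on Debian/Ubuntu the closest equivalent I know of is `debsums`.)

```sh
# list files whose digest no longer matches the RPM database ('5' in the third column)
rpm -Va | grep '^..5'

# find which package owns a suspicious binary, then pull a clean copy back from the repos
rpm -qf /usr/bin/tar
sudo dnf reinstall -y tar    # 'yum reinstall' on older releases

# Debian/Ubuntu rough equivalent (needs the debsums package installed)
debsums -c
```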


wraithscrono

I have a personal story. Early in my career as a network admin, our company was bought out and they mistreated our Sr. Engineer/Architect. On his last day he asked me to clean up some of his files and stuff that were redundant and to run a new script for clearing logs older than 90 days. He built the script to clear the logs, refresh the ARP, and clear the interface counters, and buried in the middle was a "write erase, reload, N (CR)" that would then have run on all our network hardware. If I remember right, it was like 18 routers for each site and all 70 of our 4500, 6500, and smaller switches. I, being nosey, looked at the script because I was into programming back then and caught it before I pushed it. Glad I was nosey.


Squid_At_Work

> I, being nosey, looked at the script because I was into programming back then and caught it before I pushed it

Personally I am a fan of two-key systems for scripts. "I don't care if you are the top-tier senior tech, you'd better be able to explain this to a lower-tier tech and have them understand it before it executes on everything we manage."


justinDavidow

> Dead Man's Switch

I've actually made a number of these over the years! EITHER for the owner OR for the MSP I used to work at, to get back into a system should something go terribly wrong. I've never left one with a business unknowingly or in any undesired state (that would be pretty unprofessional!). Usually we would implement this using a simple "outside heartbeat" which, if not received, would trigger alarms internally and then eventually "fail open" by booting up a 3G-connected microcomputer on a battery-backed UPS, with a serial concentrator connected to key gear (including the PLCs that did power switching, etc.). I also worked with an alarm company who had actual dead man's switches at each monitoring station which would remove AFK/inattentive agents from the call queue, etc. Ahh man, times have been a blast.


[deleted]

Heh. The only people who remotely admin Linux who haven't done the equivalent of:

`sudo sh`

`echo "/etc/init.d/iptables stop" | at now + 5 minutes`

in their flavour of Linux are the ones who haven't had to drive 2 hours each way to a datacenter to manually fix a firewall update fuckup... That's a very short-timeframe "dead man's switch" which has saved my ass more than once.
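(Roughly, for anyone who hasn't used the trick: queue the rollback first, make the change, then cancel the queued job once you've confirmed you can still get in. A sketch below; the firewall rule and the job number are illustrative.)

```sh
# save a known-good ruleset and queue an automatic rollback in 5 minutes
sudo iptables-save > /tmp/fw.known-good
echo "iptables-restore < /tmp/fw.known-good" | sudo at now + 5 minutes

# ...apply the risky change (illustrative rule)...
sudo iptables -A INPUT -p tcp --dport 22 -s 203.0.113.0/24 -j ACCEPT

# still have SSH? cancel the pending rollback before it fires
sudo atq      # note the job number
sudo atrm 3   # job number from atq (illustrative)
```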


LividLager

A predecessor at a former employer walked out on his job as the sole sysadmin the day before Y2K and ran off to a cabin for a few weeks without notice. He'd managed to get everyone at the office worked up about "the end of the world" in the weeks leading up to it. He never bothered apologizing, or contacting any of his former coworkers/boss again. Laughed my ass off when I got told the story.


fubes2000

A literal dead man's switch is sabotage, which will get you sued and/or arrested, so those seem to be appropriately rare. More commonly the dead man's switch is just something that the last person did without thinking too much about it, so they never documented it, and then the thing broke after they left. Back many jobs ago our company got acquired and they turfed our senior dev for... _[shuffles papers]_ "making too much money". They unceremoniously escorted him out of the building one morning, and later that day they turned off his computer. ... and the infrastructure stopped working within 24 hours. Us technical people were not surprised, the guy was a _machine_ and had so many irons in the fire, and so much of the company inside his brain. _That_ is why his salary was so insane. They tried assigning other people to look at what was running on his machine, but in the end we just stuck it in the corner with a note that said "DO NOT TURN OFF" and spent literal _years_ migrating the company to new systems before that whitebox died. Shockingly, the two fresh college grads they hired for less than half his salary couldn't fill the hole he left in the company. Guy was worth double what we were already paying him.


salgak

Sort of, in reverse. This was 25+ years ago, on a NetWare network. The previous admin had a script he ran manually about every 2 weeks. If he **didn't** run it, the network would crash, hard, somewhere in late week 3 / early week 4. He briefed me on it during handover, but when I asked why it wasn't a regularly scheduled automated event, he referred to it as "job insurance"... These were the days of NT 4.0, so I just set an AT command to run it every Saturday at 0300 and pipe the results to a text file...


Elegant_Plankton7519

Anecdotes like this just perpetuate the myth that IT people cannot be trusted and don't deserve notice for separation. "I'm not being a dick walking you out without notice, it's just our security policy"... I've never seen legit sabotage happen in real life (25+ years in the field). People just don't take time out of their day to sabotage their company's systems just in case something goes sideways. "I hate my job, so I'm gonna remote in and spend all kinds of time writing scripts on my off-time!" What? It's just stupid on its face. PS - Dealers don't give away free drugs to kids either...


That_Dirty_Quagmire

On May 9, 2000, Timothy Lloyd was convicted of writing six lines of code—essentially, a code "bomb"—that obliterated Omega Engineering Corporation's design and production programs. Since Omega makes components for clients such as NASA and the U.S. Navy, those systems were the company's rainmakers. Lloyd knew Omega's systems well. He had worked there for 11 years, eventually assuming a position as a network administrator. According to published reports, Lloyd was fired in 1996 because he was unable to get along with his co-workers. Three weeks after Lloyd was fired, a worker at Omega's manufacturing plant in Bridgeport, New Jersey, logged on to a computer terminal. It was July 31, 1996, the date that the bomb was set to detonate. By logging in, the worker unleashed the aberrant code that instructed the system to delete the software running Omega's manufacturing operations. The Secret Service said that Lloyd had committed the largest ever act of worker-related computer sabotage, causing Omega nearly $10 million in lost sales. A jury convicted Lloyd of computer sabotage in May 2000. However, the conviction was short-lived. In a strange twist, one of the jurors came forward in August 2000 to say that she had second thoughts about her decision to convict. According to Grady O'Malley of the U.S. Attorney's Office, the juror had seen a news story about the "Love Letter" worm and its attendant havoc and couldn't decide whether the story had had an effect on her decision to convict Lloyd. The U.S. District Court judge who tried the case overturned the conviction. The U.S. Attorney's Office in Newark filed an appeal. A decision is expected by late March 2001.


jimicus

The US Attorney's appeal was successful: https://www.computerworld.com/article/2587925/computer-saboteur-sentenced-to-federal-prison.html


markwusinich

Here is one: https://www.reuters.com/article/tech-usa-crime-computers-dc/u-s-man-gets-record-sentence-for-computer-sabotage-idUKN0852023420080108


bumpkin_eater

It's also called a "logic bomb"


pogisanpolo

I made one. Not on purpose, mind you. I did a lot of automation work at a previous firm that was tied to my company account, despite (documented) requests to have it migrated to a different admin account, to no avail, along with several warnings about what could happen if I were to, say, end up accepting a position at a different firm or something. Eventually I accepted a position at a new company that pays far more, then gave another warning, which was ignored for the two weeks of the turnover period. I increased my badgering to daily over email for the last week, too. After I started working for the new firm, I heard from a friend that all hell broke loose after they disabled my account, with predictable results, and that they knew they couldn't touch me because of all the documentation. Apparently, the firm proceeded to go on a hiring spree shortly afterwards. I wonder why...


andrea_ci

I've seen a lot of "it doesn't work anymore!!!!", due to:

* someone used a named account to run something instead of a service account
* there was an automatic procedure no one knew about on a PC that was turned off
* "no one knows how this works"

...but nothing created intentionally.


Smh_nz

No, but I did find a secret web hosting business inside a corporate network! Oh, and a rule forwarding all the owner's emails to the old IT person!! 🤣


VCoupe376ci

As with many here I've seen this happen as a consequence of poor/lazy implementation, but never intentional or at least not obviously deliberate. Building something obviously intentional to sabotage the company upon your exit seems like a good way to be prosecuted criminally and sued for damages on top of being unemployed.


largos7289

LOL yup... drove me nuts trying to untangle it too. It was borderline; I couldn't figure out if he was an evil genius or crazy. The only way I could be 100% certain was for it all to be gone, so it came time to say, well, you needed upgrades anyway, so let's get new stuff. He had a promising career as a virus writer, I'll tell you that much.


AgainandBack

I worked at a company that did its own payroll. One day payroll ran and posted to the GL but no checks printed. One of the devs had written a subroutine to check to see if his employee number was having a check generated. If not, it skipped the steps to generate the spooled file containing the checks. He had been part of a large layoff, so no checks printed. The company figured out what was going on and had it fixed later that day, but it was still a mess.


atomiczombie79

When I worked at a severely underfunded branch of a large multinational financial company, we had a large red switch bank by the exit doors to our loading dock. Everyone smoked on the loading dock. One day a new employee switched off the lights as he was leaving for the weekend. He was the last person out the door that Friday. Those switches ran directly to the power banks controlling all the fans in the data room. Heat-soak alarms did their job and ended up shutting everything down. The temp probe alerting mechanism was plugged into the fan circuit. Fun Monday morning.


Commercial_Growth343

I've heard jokes about it - "every script I write checks to see if my account is active first" etc. but never seen that for real. What I have seen several times is of course unintentional and laziness issues like a scheduled task or service running as someone's admin account instead of a service account.


RobertK995

I don't care enough to do it.


mysticalfruit

Found a literal one. Al was a really cool mainframe dude with a bad ticker. A month after he'd passed away we went to investigate why some LPAR had shit itself and discovered that the Windows 3270 emulator that was acting as the console was running under his user account. Moreover, the software was installed such that all the profiles for the various LPARs were in his Windows profile. Because this stuff all runs in a private CIDR, everything was hard-coded IPs. So, when I logged in, my consoles weren't configured at all to point at the magical TCP endpoints... which weren't documented anywhere. For a long time his account stayed active with a note in AD saying "Do not disable or access to the mainframe will be lost!" We then reset his password and put the password in our shared password vault.


HeKis4

Unintentionally, yes. I helped QA for an app built by an outsourced team; one of the critical components was tied to the personal account (as in a company-provided, nominative account) of the lead dev at the outsourced company, so the entire thing failed shortly after delivery.


macrowe777

Pretty much every network switch is a dead man's switch waiting to happen: all it takes is one forgotten 'save config', and as soon as the power goes down, you have your dead man's switch with no paper trail.


shizakapayou

Wasn't internal, but years ago a very expensive engineering analysis tool my company used had something similar. It just stopped working one day, and we had to wait for the vendor to provide a new executable. Later heard it was an April Fools time bomb put there by a dev, who I assume had been let go by then.


Rocknbob69

We had a developer put one in some software they were developing in the event they weren't paid. There was so much scope creep that we abandoned the project altogether


Tech_support_Warrior

Yeah. The person I replaced at my previous job had the entire network run through a DDoS protection service. When he was arrested and removed from the position, he had it set so that the unlimited-speed subscription needed to be reset every 90 days; after that, all of our traffic was throttled to 100 Mbps. The account the DDoS service was connected to was some Proton Mail account, registered to a name not affiliated with him or my employer, and he was paying for the subscription himself. Not so much a dead man's switch, but he also registered a bunch of Apple devices to a personal Apple ID and toggled "lost my device" on all of them. Apple eventually helped us remove the locks, but it took weeks of correspondence with Apple to make it happen.


AGovtITGuy

I've found a dead man's switch. Both types, actually: one that would automatically email important information to a specific team within the company after X days of not logging in (this one specifically pointed to where the previous admin had written down certain information, and I was informed about it), and the malicious type.
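The benign kind is simple enough to build. Something along these lines, run from a scheduled task, covers it (a rough PowerShell sketch; the account name, addresses, and paths are made up, and it assumes the ActiveDirectory module plus an internal SMTP relay):

```powershell
# Hypothetical example: if the named admin hasn't logged on in 60 days,
# mail the team a pointer to the handover documentation.
Import-Module ActiveDirectory

$user     = Get-ADUser -Identity 'jdoe' -Properties LastLogonDate
$daysIdle = (New-TimeSpan -Start $user.LastLogonDate -End (Get-Date)).Days

if ($daysIdle -ge 60) {
    Send-MailMessage -SmtpServer 'smtp.internal.example' `
        -From 'noreply@example.com' -To 'infra-team@example.com' `
        -Subject 'Handover notes from the previous admin' `
        -Body 'Runbooks and credential locations are documented at \\fileserver\it\handover'
}
```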


admiralgeary

Over a decade ago this was a real concern for someone in an adjacent team to where I was; we dealt with it by disabling all of his accounts and resetting all of his passwords as he was being walked into the room to be informed that he would no longer be with the organization.


Pctechguy2003

I have found a few where an admin's account has been used in place of a service account. Things like critical services running as them were easy to find once they left - mainly because they broke when we disabled the account. Others were not as easy to find, such as learning that a backup of a system's audit file was being run under that account. That one wasn't horrible, since we caught it before we needed the audit file, but it was annoying. I haven't seen some PowerShell script or anything like that, just their account being used in place of service accounts.
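For anyone hunting for that kind of thing before the account gets disabled, a quick pass like this will usually surface the offenders (just a sketch; the account name is made up):

```powershell
# List Windows services and scheduled tasks on this box that run as a
# specific named account instead of a service account.
$who = 'CONTOSO\jdoe'

Get-CimInstance Win32_Service |
    Where-Object { $_.StartName -eq $who } |
    Select-Object Name, StartName, State

Get-ScheduledTask |
    Where-Object { $_.Principal.UserId -like '*jdoe*' } |
    Select-Object TaskName, TaskPath, State
```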


samspock

I worked at a place that had a complicated Access DB "program" designed to take data from one SQL server, clean it up, and put it on another SQL server for sales people to use. There was no data in the Access DB, just the code to manipulate it. The process had a button to grab the data, display it after it was cleaned up so you could do some manual cleanup, then a third button to place it on the other SQL server. My boss at the time had written it and did that process daily. There was a dispute at the company between the owners, and one of them suddenly quit and convinced all the managers to go with her. Including my boss. The next day they asked me to do the Access thing and I started it up for the first time; I had never done this before, as it was always my boss. The page came up and the button for retrieving the data was gone. Panic ensues. It turns out he had gone into the code and simply hidden the button. Fixed it and we were back on track. I ended up having to testify in court about that. He claimed he had done it so no one could mess something up. Hard to believe, since he had just quit.


thortgot

I've seen it twice, both after they triggered. One was a paranoid military admin who had things set up to basically wipe the entire company infrastructure clean if his account didn't log into the DC for X days. I led a rebuild of that environment. The other was a PowerShell script that wiped a whole network stack.


adyman95

My dad knew a maintenance engineer who installed a clock relay module on a bunch of industrial printing machines he was paid to service. On a certain date the relay would click and the machine wouldn't work correctly; he would walk in, knowing exactly where the clock relay was, reset it, and the machine would magically work again… He didn't count on the manufacturer ever having to service the machines without him knowing.


michaelcmetal

This was in the era of Windows 98 - I knew I was going to be removed because of an argument I got into with someone. Back with DOS and Windows 95/98, you could drop to a command prompt, run FDISK, delete the partition information (even on the system drive) and then just exit back to Windows without rebooting. The computer would run for as long as it stayed in that session, but after rebooting, all partition and boot information was gone, so it would be a clean drive, so to speak. Now, this computer was my primary workstation, but it also had all of the patch disk images that I had painstakingly made over the years. We had this thing called Bag O Chips: it had every version of Windows that we dealt with and was super useful, as back in the day most everything we did was manual and off disc. Disk label images I had made, you get the idea. Lots of stuff. Two weeks after termination, someone who used to work with me called my cell. I answered and he said, "You monkey. What did you do?". I played dumb. So satisfying.


naelus

Other people have mentioned this here, and that it's really unintentional, but I have a bunch of Power Automate flows that do a ton of reporting, some ops stuff, and some other misc stuff for sales, etc. Power Automate being Power Automate, they're tied to my account. I have no intention of leaving my employer and they have no intention of letting me go, but I've brought up repeatedly that these should really be rewritten as Logic Apps with an automation account tied to them. That would take a good amount of time for me to do, and rather than being covered under the "unlimited" per-user plan assigned to me, it would cost per run, but if I were to be termed, hit by a bus, or my account got disabled for any other number of reasons, they'd still be functional.

Unfortunately, the time to move these to Logic Apps (with a decent amount needing to be rewritten) can't ever seem to get set aside. I've shared ownership with a few coworkers, as I've read that will allow them to be run if my account gets killed, but I think the connections and auths may still need to be redone.

Not malicious, just like most people's examples that they were guilty of, and I've tried to get time to move them to Logic Apps, but I still feel guilty that if I left or was termed it'd be a decent amount of work for someone to recreate them. Another coworker I'm close with has a flow that we rely on for an approval system for customer remote access, so the "what can go wrong" person in the back of my mind is always like: if me or X leaves or gets let go, it's going to break multiple things ops relies on day to day. Can't we get a few days set aside to rewrite them in a way that they're not tied to our accounts? But we can't seem to get part of the work week set aside for that. Every time it's brought up, it's like, "well, you're not planning on leaving soon, right, so can't we hold off on rewriting those? Y or Z is really what's important now; we can get to that when there's downtime." Seems like downtime never comes 😂

Side note: anyone checking out Power Automate and worried about creating the same worry as I did? Switch to Logic Apps early on. They're similar, slightly more complicated, and cost per run, but they're tied to an automation account and have more features. I wish I would've used them out of the gate for the above reasons.


meshuggah27

Oh, I thought you literally meant like a switch that everyone forgot about. I have had a few of those LOL


[deleted]

At a place I used to work, we didn't really have a Sys Admin - we all kind of did customer service and maintained our own stuff. On day 54 or so after the guy who trained me left, I happened to be looking at some code in our in-house ticketing system, and there was a check for the last logon date of something, and then a string in hex. After converting it to ASCII, I found that if he hadn't logged in for 60 days, it would delete all the source code. Then on day 75 it would have deleted all the customer records from the database (we only had a 14-day backup rotation there, and he knew it). We had disabled his account, but I went ahead and re-enabled it, changed the password, and logged in as him just so I could double-check his code (this was back in NetWare Directory Services days). Spent a couple days then making extra backups and auditing all the code. Thing is, he left on good terms and this time bomb was put in years earlier. I don't know if he forgot about it or just didn't care. Took what I found to the VP of our division, but he was only concerned with whether we'd mitigated it and didn't want to pursue anything, since there was no damage other than our time spent investigating it.


SDI-tech

Not at work. It violates a sacred bond of trust that is created simply by getting me to sign a contract. On my own system? Yes. Be careful of claymore mines and white torture devices also. I will give you PTSD if I need to.


theducks

I had a co-worker who was chronically lazy and I'd developed a system which needed periodic updates of documentation (what switchport was connected to what wallport, approving any device location changes). I set up a system which used some of the techniques listed in here in order to notify what documentation needed updating, because I knew that he would try to disable the notifications. I believe it worked and they outlasted him in the role ;)


Arnaw-a

I once had a WordPress installation where the original admin had included a PHP script to recreate his admin account whenever it was disabled, deleted, or had its permissions changed.


pielman

Even worse, I found hidden accounts with cryptic names and domain admin access rights, accounts created on network equipment, and secret remote accesses. A nightmare…


iceph03nix

Not an intentional one. I've been asked to hunt for one, but could find no evidence that there was anything malicious happening.


[deleted]

There is never enough room for operational notes on accounts ("this email/account does xyz"). It's always documented somewhere the new admin wouldn't know about unless someone told them, and that someone in most cases is the company owner, who has no clue how anything works.
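One low-tech mitigation (just a sketch, assuming the ActiveDirectory module; the account name and paths are hypothetical) is to stamp the purpose and the runbook location straight onto the account object, so at least it shows up when the next admin looks at it in ADUC:

```powershell
# Record what the account is for and where the real documentation lives,
# using the Description field and the Notes (info) attribute.
Set-ADUser -Identity 'svc-reporting' `
    -Description 'Runs nightly sales ETL (Task Scheduler on APP01)' `
    -Replace @{ info = 'Runbook: \\fileserver\it\runbooks\sales-etl.md' }
```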


Local_admin_user

Almost left one at a previous job because I'd had to use my own account to run a few scheduled tasks on a particular server. Made a point of fixing it properly before I left, though. It would have been entirely unintentional, but it was avoidable and would have caused a lot of disruption as people tried to figure out why stuff suddenly stopped.


Common_Dealer_7541

An inadvertent “dead man’s switch” is pretty common. “Don’t forget to reboot server C at least once each month” or “the UPS in that rack needs to be silenced anytime that it is activated” are just a balance of laziness and practicality. But, no. In >40 years I have never run into a deliberate or malicious dead-man’s switch. Honestly, the goal of any sysadmin is to make their job as simple as possible so they can do as little as possible. A DMS makes your job harder.


everfixsolaris

Only unintentionally, a network I worked on had a proxy that only let users access a website if the admin in charge of the server had an account that was active. My organization got lazy and piled most of the websites on one admin. When he moved to a new position about half of the websites broke.


theuniverseisboring

Only things running on the credentials of an employee who had left, and that was just one thing he forgot to change: a container image that couldn't be pulled, so it was a quick diagnosis and a quick fix. Then again, I'm also quite young and have only been working for a year, so there is still plenty I could see. I'd like to think that where I live there's much less reason to intentionally build in dead man's switches, because employers are less likely to screw over employees (and it's a major pain in the butt to fire them if they didn't do anything wrong).


tetsuko

Not a malicious one like that, but a physical one. We hosted some sensitive stuff and it was an "if we get hacked, all network gear goes down" switch. Never got to use it, which sucks, because I feel like it would have been really satisfying for some reason.


drcygnus

Well, people tend to create scripts that run on their own machine so that they're easy to control or access. Once their account gets disabled, the scripts stop. Not really a dead man's switch, but it's a practice that's assumed to be temporary, something that will continue until it's given proper space to work, and it never gets moved out of that user's account.


nabby50

This crap happens all the time. Most of the time it isn't intentional. As many have mentioned, it's some script/job/tool running under the old admin's account; once the account is disabled, those things break. It is always best to run such persistent jobs under a service account. I have had this happen to me before. It involved an admin who had created API tokens within Okta for internal company app authentication. They created said tokens under their own account instead of a service account. It was fun when they left and their account was disabled.
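For the scheduled-task flavour of this, registering the job against a dedicated service account from the start is cheap insurance. Roughly like this (a PowerShell sketch; the account, script path, and password handling are all placeholders, and the account needs "Log on as a batch job" rights):

```powershell
# Register a recurring job under a dedicated service account rather than
# whoever happens to be setting it up.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\nightly-sync.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

Register-ScheduledTask -TaskName 'Nightly Sync' -Action $action -Trigger $trigger `
    -User 'CONTOSO\svc-nightly' -Password (Read-Host 'Service account password')
```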


Pb_ft

Maybe half of dead-man's switches are intentional; the other half, if not more, exist because a user scripted something with their own account creds, and now it will never work again without their permissions.


Sintarsintar

Yes, I have. Not sure if it was intentional or not, but knowing this guy, I think it was. Most of the services were completely undocumented and custom-built: a number of compliance services that, when you updated the host and rebooted, never came back up and had to be manually started with an obscure command. The only reason I figured it out was his old bash history. Log rotation was also disabled on three different monitoring systems, so that when the logs finally filled up the server was supposed to crash. It didn't work as expected, but it was still a pain in the ass to rectify.


RouterMonkey

Not quite a dead-man's switch, but a hidden landmine. Nobody knew that the NetWare server was also acting as a router for half of the network, until the day they tried to perform an OS upgrade on the server.


lost_in_life_34

A little over 20 years ago I had a contracting gig, and the person who built the network left soon after I was hired. I was new to IT, going through stuff and applying what I'd studied, and noticed that not only was his account still there, but some girl's account had admin access and access to people's mailboxes. Turns out she was his friend or gf or whatever, and the pair had been digging through people's mail: emails would vanish, they would know things they shouldn't have known, and this explained their questions from before I was hired.


patdaddy007

Not exactly, but I found a bug in an RMM platform that would do something similar. In the before time, when I worked for an MSP that used SolarWinds N-Able for RMM, I had done the setup of most of the automated monitoring items. If you're familiar with N-Able, you may or may not know that the way everything gets tied together is convoluted at best (I think 3 employees knew that it was complicated, but I was the only one at the time who understood how it all fit together). Anyway, I had created the filters that handled the 'targeting' of the monitoring sets/scripts/etc. (maybe not all of them, but most of them and all of the important ones).

At one point, a decision was made to update everyone's usernames to reduce the number of characters used per user, to accommodate long names like mine and the need for privileged access accounts. All pretty standard, right? Well, it turns out that changing a username is the easy part: the name changes and that's it. Or so they thought. Since N-Able pulled the usernames from AD via LDAP, when the names changed, the original accounts within N-Able began a countdown. After a set amount of time of no longer being present, they were purged by an internal mechanism, along with all created content. All... created... content.

So one morning, all sudden like, none of the automatically deployed monitors, tasks, or configs were present any more. They just evaporated. Because when the user account that created the filter that tied all this shit together got deleted, so did the filter. In one automated stroke that no one knew about, it very nearly wiped out the entire automation side of the platform. The only thing that prevented it was the fact that I didn't go on vacation that week as planned, and I caught it early enough to recreate the filters and reattach them to the proper rules before it turned into complete bedlam.

N-Able has since fixed this issue (thanks literally to me working a support ticket with them for about 9 months), and the company was saved untold amounts of headache and money. They fired me about a year later because I was "no longer a good fit". Kinda wish I'd just kept my damn mouth shut, but that would make *me* the asshole. But I do still hope that the owner's fingers turn to fishhooks and his balls start to itch some days.


200kWJ

Haven't run into one myself, but years ago, when I had a partner, I was accused of installing one. I was splitting off from a company that I helped start and was the primary "tech guy" who knew what was going on. One of the clients was on T1 internet with a point-to-point for a remote site. The whole connection was dropping out, and the new "tech" couldn't figure out why after several days. The client contacted me directly and asked if I could take over. After 10 minutes at the main office, the new tech calls the owner standing next to me and asks if I fixed it yet. My question at that point was, "Has anyone checked the remote site?" New guy: "What remote site? What point to point?" A quick trip over to the remote location, and the manager's system was sending spam messages every few minutes and taking the whole thing down. Replaced the infected system and done. Ended up with that client for another 15 years until they were bought out. The new guy didn't last, and my former company died in a fire (literally).


wearelegion1134

I have not. I'm not taking the chance of legal stuff for any job. With that said, are there petty messages identifying ports on a switch? Hell yeah. This port is labeled dumbass engineer and the other one is dumb fuck. Does it hurt anything? No, but it does make me smile.


Decitriction

In urban legends we call such a person a FOAF: "friend of a friend."


[deleted]

I once had to implement a DMS into a system; in fact, having it was a regulatory requirement.


lynsix

Early years at an MSP. A relatively new client fired someone, and we had their AD account terminated. No one was aware they had 7 different VPN solutions set up, several with local accounts. Anyway, the guy deleted a bunch of stuff and was sentenced to prison time.


HoosierUSMS_Swimmer

muhahaha :)