So I work on the maintenance team for the development department of my company. My job is basically to fix bugs that come up and to implement minor feature requests that don't represent any major new functionality. In a nutshell, I deal with all the day-to-day maintenance stuff so that the rest of the team can focus on longer-term issues like integrating new components into the system.
One of the major efforts over the past several months has been to add a Content Management Server (CMS) to the mix, something with which I have not been involved in any way. About two months ago the CMS team checked in changes that effectively broke the application in the development environment. You would go to log in, it would appear to accept your login credentials without error, and then instead of being taken to the application's home page you would be kicked back to the login screen. No matter what you did you could not log onto the system, rendering the entire web application inaccessible.
This, of course, completely blocked me from getting any work done. I made some noise about it, and those new changes were rolled back while the CMS team fixed the problem. At least, I assumed that what they were doing was fixing the problem. A day or two later there was a new checkin of CMS code, and the login worked properly in the dev environment. End of story, right?
Right.
So about a week and a half ago I was asked to work on a bug indirectly related to CMS. The CMS team had their hands full working on other things, and I had the bandwidth to look at it. Something about CMS was interfering with another feature on the site, a feature that gets used by our tech support team on a daily basis. I started looking into it, and discovered that the "fix" for the earlier problem was simply to unwire CMS from the dev environment. The single biggest new feature set of our impending 4.0 release, and it was not even turned on except in the formal test environment. This meant that 1) I didn't have any way to troubleshoot the problem I was tasked to fix, and more importantly 2) for weeks developers outside of the CMS team had been writing code without any clue how it might interact with CMS. Given that one of the functions of CMS was to handle authorization credentials for the site, virtually any page on the site could potentially have unexpected interactions, and I as a developer would never know about it until someone in QA stumbled across it in test.
I was not happy. In fact, I made a pretty huge stink about it. All I wanted from the CMS team was specific instructions as to what had to be done in order for CMS to be turned on and work properly in the dev environment. They were utterly unable to answer that question, which worried me even more. They clearly had no idea what the actual problem was, and their general attitude seemed to be that since it worked on their machine and it worked in test, there was no problem.
I spent most of the day the Friday before last troubleshooting the login issue, and eventually identified the root problem and came up with a fix so everything would work properly in dev. I checked in that fix and shared it with the CMS team, and with the dev team as a whole. End of story, right?
Right.
So last night was supposed to be the deployment to production for our 4.0 release. Guess what? As soon as it was deployed, it was impossible to log into the site. The exact same symptoms. The whole thing had to be rolled back, everybody has to stop whatever they are working on today, and we need to figure out how to fix the problem. Except that as of right now, nobody seems to be in the office yet besides QA and upper management. This is such a huge black eye for the dev department I can't even begin to describe it, and the whole damn department seems to collectively have their pants down around their ankles.
Grr.
If the CMS team had actually worked the problem two months ago when it came up, I wouldn't be in such a cranky mood today.