1.) Always show up with solutions, not problems.
With good reason, Rule #1 is the first thing to remember. You never want to be the person who impedes development or release; you always want to be the person who improves them. By the nature of your job, you will often be the one delivering unwanted bad news, but if you gain a reputation for delivering the bad news along with some good news, like ways around a problem, then people will seldom dread talking to you. Take responsibility for improving the product, offer fixes, and keep offering fixes even when it's not your job to fix the bugs.
Never utter the words "show stopper." Even if you are right, it may not be your decision, and you can easily make enemies of the people you need to work with every day. Instead, educate others on the potential harm to customers, potential loss of customers, and if that doesn't work, the potential harm to the viability of the company as an ongoing venture. When you need to do this, and you will need to do it, be prepared with multiple paths for working around or fixing the issue, along with a cost estimate for each. In other words, lead others into uttering the fateful words "show stopper." In short, your job is to make the risks and consequences known, and to provide alternatives that mitigate the risks and avoid the consequences.
Pointing out mistakes is always a touchy situation. Many people react poorly to being told that something they did is wrong, so try to memorize the secret formula, "We can improve X by doing Y to avoid Z." The "we" is important and you can even toss a "probably" in there somewhere for added effect.
If you can reliably remember the secret formula even when your scalp is sore from pulling out all of your hair, then please tell me how. ;)
2.) You will always have constraints, so know and memorize them.
When an executive spends $200K on a piece of test equipment, and the other engineers want data at some super fast resolution beyond the capacity of the equipment, you are the person responsible for knowing the constraints of the test equipment. This has actually happened to me, and it's a whole lot of no-fun. You get stuck between a rock (the other engineers) and a bad place (the exec who doesn't want to look bad for buying the wrong/cheap equipment). Sure, it may seem anecdotal, but you'll be surprised how often you are asked to do the impossible.
When you know the constraints of your test system, then you can show up saying, "We can do X with what we have currently, or you can push upstream for more investment in test infrastructure. Shall we try X first to see if it will suffice for your needs?"
In your situation with web apps, particularly mobile-ready web apps, testing will require a big investment in test infrastructure. If your web app uses any of the newer direct-to-hardware features (WebGL, Web Audio API, ...), emulation alone (system/browser images under VMware or similar) may not suffice for proper test coverage. Emulated hardware is never perfect, so you'll often need access to real hardware. Of course, you can often do tons with just emulation, but knowing where emulation will fail in strange and unexpected ways means knowing the constraints of emulation.
Knowing your constraints is knowing what you can actually test. When others have unrealistic expectations, knowing your constraints puts you in charge of the negotiations, and it's always a negotiation. If need be, keep a constraints cheat-sheet around with the details. You'll be surprised how often it comes in handy.
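The cheat-sheet doesn't have to live on paper, either. Here's a minimal sketch in Python of what a machine-readable constraints sheet might look like; the equipment names and limits below are entirely made up for illustration:

```python
# Hypothetical constraints cheat-sheet: what each piece of test
# equipment can actually deliver. Keep this under version control
# alongside your tests so the limits are never a matter of memory.
CONSTRAINTS = {
    "scope-A": {"max_sample_rate_hz": 1_000_000, "channels": 4},
    "daq-B": {"max_sample_rate_hz": 50_000, "channels": 16},
}

def can_satisfy(equipment, sample_rate_hz, channels):
    """Return True if the named equipment meets the requested specs."""
    spec = CONSTRAINTS[equipment]
    return (sample_rate_hz <= spec["max_sample_rate_hz"]
            and channels <= spec["channels"])
```

When an engineer asks for "super fast resolution," you can answer with a lookup instead of an argument: `can_satisfy("daq-B", 100_000, 8)` comes back `False`, and now the conversation is about investment, not about you.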
3.) Within your now known constraints, define the specific capacities you want to verify, and the means to verify them.
Once you know the desired capacities and the means to test them, the automation of verification becomes a whole lot easier. Your existing bug tracking database should be a good place to start for building up your regression tests, but preventing the reappearance of old bugs is only a small part of the problem. The majority of the problem is finding the never ending stream of new bugs.
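One habit that makes the bug database pay off: when a bug gets fixed, turn the ticket directly into a regression test named after it. A small Python sketch, where the `slugify` function and the bug number are hypothetical:

```python
import re

def slugify(title):
    """Lowercase a title and join the words with single hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def test_bug_1234_no_double_hyphens():
    # Hypothetical ticket #1234 reported that consecutive separators
    # produced runs of hyphens. Naming the test after the ticket means
    # a future failure points straight back at the original report.
    assert slugify("Hello,  World!") == "hello-world"
    assert "--" not in slugify("a -- b")
```

The ticket number in the test name is the cheap but durable link between your regression suite and your bug tracker.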
Since browsers and operating systems are constantly changing, you never have a solid foundation. This means your test environment requires extremely strong versioning to manage the vast multitude of browser, OS, hardware and version combinations. It's important to realize that automatic updates are your sworn enemy. They will hose your test environment: if you don't know what you're running, then you don't know what you're testing. Of course you'll test the latest and greatest versions of everything, but each new update creates a new environment version that needs to be preserved.
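It helps to stamp every archived test run with an exact environment fingerprint, so "what were we testing?" always has an answer. A rough Python sketch; in practice the browser name and version would come from your driver rather than being passed in by hand:

```python
import platform
import sys

def environment_id(browser, browser_version):
    """Build a reproducible label for "what are we actually testing?".
    Attach this string to every test run you archive."""
    return "_".join([
        platform.system(),      # e.g. "Linux" or "Windows"
        platform.release(),     # OS/kernel release of the host
        "py%d.%d" % sys.version_info[:2],
        browser,                # hypothetical: your driver reports this
        browser_version,
    ])
```

When an automatic update sneaks through, the fingerprint changes and the run gets filed under a new environment version instead of silently polluting the old one.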
Environment versioning is a real pain. The number of environments you'll need to manage and test grows combinatorially (every new browser version multiplies against every OS and hardware variant), so with time as the always limiting resource, you'll not only need to automate as much as possible, you'll also need to do triage. Testing every possible combination is intractable.
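Pairwise (all-pairs) testing is the classic triage move: instead of running the full cartesian product, pick a subset of combinations that still exercises every pair of factor values at least once. A greedy sketch in Python; the browser/OS/network matrix is invented for illustration:

```python
from itertools import combinations, product

def all_pairs(factors):
    """Greedy all-pairs reduction: keep adding whichever combination
    covers the most still-uncovered value pairs until every pair of
    values across any two factors is covered at least once."""
    names = list(factors)
    uncovered = set()
    for i, j in combinations(range(len(names)), 2):
        for a, b in product(factors[names[i]], factors[names[j]]):
            uncovered.add((i, a, j, b))
    candidates = list(product(*(factors[n] for n in names)))
    chosen = []
    while uncovered:
        best = max(candidates, key=lambda row: sum(
            (i, row[i], j, row[j]) in uncovered
            for i, j in combinations(range(len(row)), 2)))
        chosen.append(best)
        for i, j in combinations(range(len(best)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return chosen

matrix = {
    "browser": ["chrome", "firefox", "safari"],
    "os": ["windows", "osx", "linux"],
    "network": ["wifi", "3g"],
}
full = list(product(*matrix.values()))
reduced = all_pairs(matrix)
# every pair of values still appears, but in far fewer runs than
# the full cartesian product
```

Greedy set-cover won't be optimal, but for a matrix like this it cuts the run count roughly in half while still catching any bug triggered by an interaction of two factors; only the rarer three-way interactions slip through.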
Many sites try to avoid the pain of environment versioning by only supporting the most current version combinations, but in doing so, they only avoid the expense of supporting "legacy" combinations. There will still be plenty of "newest" combinations, so their test infrastructure will undoubtedly fail to give proper coverage on some of them.
Virtual machines, emulation, simulation, system images (snapshots), and similar along with your favorite version control system (git, cvs, svn) will really help to manage the env versioning issues. It can be a ton of work to get things all set up the way you want them, but the benefit of being able to give good and repeatable coverage makes it worthwhile.
Personally, I'd go with open source as much as possible since it will use a normal programming language. It will mean reinventing the wheel in some situations, but for me, it seems better than the alternatives. The generally known open source web app testing frameworks are Selenium [1], Watir [2], and RobotFramework [3]. If you hate real text editors and prefer to suffer from using an IDE, some IDEs have plug-ins like CubicTest [4] for Eclipse which can drive Selenium/Watir.
[1] http://www.seleniumhq.org/
[3] http://robotframework.org/
Vendor lock-in from proprietary test suites and proprietary languages can be immensely aggravating and expensive. You might have the budget for expensive proprietary test suites, or you might not, but if you ever need to do something out-of-scope or need to change vendors, then you'll be proverbially stuffed. Sadly, I don't know the proprietary tools at all, and worse, I'm against them without adequate information. Yes, I suffer from the usual "I can do it" bias and personality fault, namely valuing my time less than valuing my money. Your situation may be different.
There are quite a few different "Software Test" and/or "Quality Assurance" groups and conferences around. As you might expect, they often favor proprietary test solutions since it's usually the vendors of said solutions footing the bill for the conferences. Even with the deck stacked in favor of the proprietary solutions, the groups/conferences can be a good place to learn. EuroStar is one common conference, but there are many others if you look around. When looking up EuroStar, I found a writeup from last year that you might want to look over:
http://www.eurostarconferences.com/blog/2013/7/25/comparison...
If you've ever seen the musical "The Wiz," or even the movie of the same name, try to memorize the lyrics of the song "No Bad News". If you can learn to sing a reasonable a cappella rendition of it, even better. It will definitely come in handy, and more often than you'll want to admit. ;)
http://www.youtube.com/watch?v=pQT-QFy5Nig
Good Luck!