Here's an interview with Daniel Lidar, who was the postdoc who first taught me quantum error correction (and more). No, not that LIDAR!
Note to all you job seekers: even in your darkest hours, know that you have friends out there who are working to change the abysmal state of quantum computing hiring:
I would also hope to see a wave of new faculty positions at US institutions for quantum computation theoreticians and experimentalists. We now have the first generation of students and postdocs trained in this field, many of whom are finding it very difficult to land faculty positions in the US, and are forced to seek such employment in other countries. This is most unfortunate, and I hope that US universities will reverse this trend.
You should qualify that by saying "the abysmal state of quantum computing hiring at research universities." I offered two quantum information/computing folks jobs two years ago and was turned down by both despite a recession and bad job market. And it's not like we're in podunk North Dakota. We're an hour north of Boston. I expect to be hiring again in a few years as we consider adding a master's program and, as a quantum information guy, I'd like to bring in someone in that field, but as a department chair I need to do what's best for the department. If I keep getting turned down I'll start looking elsewhere.
Frankly, the problem is that the community is still too clique-ish and insular and you won't change people's minds until you engage with them, both professionally and personally (I know someone who just recently had a bad experience with quantum information people).
The kinds of real-world problems driving theoretical computer science research deal with large datasets and distributed computation (e.g., Google and cell phones). Quantum computers are well-suited for neither. Quantum computers appear to be best at quantum simulation, which isn't a problem CS people have much experience with. For how long can a field be divorced from practice? For how long can a field generate big new ideas and attract new talent without any tenure-track hiring? The one sub-area where quantum computation has gained traction is computational complexity, whose practitioners are used to manipulating many computational models and for whom quantum computers are a very natural fit.
Jim,
Well, that partly depends on whether you believe that D-Wave's adiabatic chip is really quantum. If it is, then Google's already using quantum computing technology for pattern recognition, since they're using D-Wave's chips to make StreetView even scarier than it already is (I've seen a demo).
@JimM: The adiabatic quantum optimization algorithms are natural fits to machine learning, which underlies pretty much all large dataset / distributed computing applications.
@Ian: Have you tried Goggles yet? :)
A quantum computer can't be run on a large dataset without putting the data in a quantum-accessible form. Doesn't this limit any applications? Perhaps quantum streaming algorithms could still work well.
@JimM: Yes, this is a little counterintuitive. The way we have been dealing with it is described here: http://arxiv.org/abs/0912.0779. At a high level, you need to find ways to segment all the data into small chunks that can be fed serially into the processor.
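The chunking idea above can be sketched in a few lines. This is only an illustrative classical stand-in, not the method from the linked paper: `process_chunk` is a hypothetical placeholder for whatever subroutine the small processor actually runs, and the chunk size stands in for the processor's limited capacity.

```python
def chunks(data, chunk_size):
    """Yield successive fixed-size segments of the dataset."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

def process_chunk(chunk):
    # Hypothetical placeholder for the per-chunk computation
    # (here it just sums the chunk's values).
    return sum(chunk)

def process_serially(data, chunk_size):
    # Feed each small chunk into the processor one at a time
    # and collect the partial results.
    return [process_chunk(c) for c in chunks(data, chunk_size)]

print(process_serially(list(range(10)), 4))  # prints [6, 22, 17]
```

The point is simply that a processor limited to `chunk_size` inputs can still cover a large dataset, at the cost of a serial loop and whatever post-processing is needed to combine the partial results.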
I left academia because of the lack of jobs after I got my PhD. I really didn't have any friends out there helping me.