Archive for: April, 2013

Lamar Smith ups the anti.

Apr 29 2013 Published under [Biology&Environment]

Unfortunately I don't have time right now to do this justice, but after my post on Friday I couldn't let Lamar Smith's new bill to remove peer review from NSF go unmentioned. In addition to this ridiculous bill, which would undermine non-medical science in this country, he has also requested the reviews and PO comments for five particular grants whose titles he didn't like. For someone with such an appalling voting record when it comes to science, suddenly Lamar thinks he can step up and do a little grant reviewing? The right wing's naked anti-science agenda is reaching the absurd. Clearly they aren't even attempting to hide it anymore.

12 responses so far

Really, why shouldn't we leave US science in the hands of a conservative tea party lawyer?

Apr 26 2013 Published under [Education&Careers]

Fresh on the heels of the Coburn amendment to the restoration of much of the NSF budget for 2012, which is likely to decimate political science funding in the near future, Congress is trying to further limit NSF. Rep. Lamar Smith, Chair of the House Committee on Science, Space, and Technology, is proposing to require all NSF-funded research to justify how it will benefit the US population.

Set aside for a second that NSF funds fundamental science that is not required to have a direct application to human populations (I think we have a funding agency for that). The whole point of basic research is that it is foundational. In many cases the results of studies NSF funds today may have massive human impacts down the road. For some, those impacts might be intentional or predicted, but for most they will not be. The idea that we can easily predict a project's impact on society a priori is a unicorn-riding fantasy.

But the bigger issue here is why we have a SOPA-supporting, climate-change-skeptic tea party lawyer deciding how NSF should set its science funding priorities. Maybe he's an upgrade over his predecessor, Ralph Hall, but the absurdity of the Congressional Science Committee being populated by people with zero background in science is disturbing, though nothing new. Maybe our situation isn't as bad as what is happening in Canada, but we need to be concerned when politicians think they know how to "improve" the way a science agency runs its business.

Smith's suggestion, made during the afternoon hearing, could signal yet another twist in the debate. It also suggests that his thinking had evolved in the 2 hours between hearings. Instead of confining himself to social science research, as he and his Republican colleagues had done during the morning hearing with Holdren, Smith focused on NSF's entire portfolio in his afternoon comments to acting NSF Director Cora Marrett and Dan Arvizu, chair of the National Science Board that oversees NSF.

"These questions are not easy," Smith said in his opening statement. "It requires recognition that we might be able to improve the process by which NSF makes its funding decisions."

Later in the hearing, Smith made the case for a new yardstick with which to measure an NSF grant that would focus on its likely contribution to "the national interest." Turning to Arvizu, he said, "If there's a way to improve the process by which NSF makes its awards, I assume that you'd support it."

If, in fact, the Science Committee splits NSF out of the COMPETES Act and into a separate bill, it will make it pretty clear that the intention is to tie the agency's hands as much as possible. This is not about setting national funding objectives; it's about people who don't understand science trying to force money away from anything they fear. I think the president's Science Advisor, John Holdren, summed it up pretty well when he said, "I think it's a dangerous thing for Congress, or anybody else, to be trying to specify in detail what types of fundamental research NSF should be funding."

10 responses so far

Words of wisdom

Apr 25 2013 Published under [Life Trajectories]

No responses yet

Recovery

Apr 25 2013 Published under [Education&Careers]

This week's free single on iTunes.

No responses yet

The writer's shuffle

Apr 24 2013 Published under [Education&Careers]

There are only so many words one can write in a day. Maybe that's part of the reason my blog posts are normally just a few paragraphs, but many of my writing hours in the last few years have been burned up on grant proposals. There's always another opportunity or accessible-looking pot of money to try for so you can keep the lab wheels turning. Crap funding rates drive more proposals, and so the cycle marches on.

The cost of the proposal churn is our publication rate*. I'm not happy with it. The blame for the glacial pace falls squarely on my desk, too. I am the limiting reagent. We have some manuscripts moving through the system right now and a few more working their way there. But damn.

The more I talk to people in my cohort who are facing a tenure decision soon, the more I see a dichotomy in how time was spent. Most people lament either their publication record or their funding record as they reach this stage. We've made our choices about how to spend our time, and I have no idea whether one strategy is better than another. Sometimes you choose a strategy and sometimes you have less control over how that ball bounces.

Ironically, I know for certain that I've been invited to collaborations and to conference and seminar talks based on what we've had in review. At one point I was asked to come give a talk on a topic we had zero publications on, but two proposals in review about. I guess that speaks to the level of preliminary data required these days, but also to the possibility that proposal review is a bit of an "online early" for ideas. Yet another reason to get involved in the review process.

*Yes, I realize this was specifically cited as a reason for NSF cutting back the proposal writing and reviewing load on the community. It's a benefit of the new system, even if I wish we didn't have to go to 1 cycle a year.

14 responses so far

DEB and IOS: differences in the review process

Apr 22 2013 Published under [Education&Careers]

Before I had the opportunity to get involved with the review process at NSF, my impression was that it was standardized across the board. At least across the BIO directorate it had to be, right? Well, yes and no.

Obviously there are core aspects of review that remain the same, but there are some subtle differences as well.

Triage: DEB panels are somewhat aggressive about triaging proposals that panelists universally dislike. If you can't get the three reviewers to each rate a proposal above "Good", there is no discussion of that proposal. The PI will get the three reviews, but no panel summary. IOS maintains that every proposal should be discussed, so you will always get a panel summary back from an IOS submission. I can see the advantages of both and don't think one approach is better than the other.

One unintended consequence of this, however, is the deadline for reviews. DEB tries very hard to have all reviews in ahead of the panel in order to build the triage list and determine how many proposals will be discussed. IOS also wants the reviews ahead of the panel, but it doesn't always work out that way: while reviews are required to be filed before the panel summary, they are not required before the panel itself. That is not to say those reviews are done any differently, just later.

{UPDATE: This policy may have been revised in DEB this year. See comments}

Caucusing: DEB allows it, IOS does not. The idea is simply that the reviewers discuss the proposal ahead of table discussion to see if they can work out differences. It makes the group process more efficient, but also has the potential to obscure some details from the rest of the group who might otherwise weigh in.

Previous panel summaries: On a DEB panel the reviewers are given access to previous panel summaries if a proposal is a resubmit. With funding rates as low as they are, the majority of proposals fit that category. This access allows reviewers to determine whether the PIs took previous panel advice (good or bad). IME, if there is relevant information along these lines, the POs in IOS will convey the information to the panel in discussion.

Rating: Based on panels I have been on, DEB and IOS use their rating categories differently, which is worth keeping in mind when you read the opening review statement. DEB tends to put a much higher percentage of proposals into "Not Competitive"; in their usage, this simply means the proposal doesn't have a shot against the others received. IOS, OTOH, uses the NC designation sparingly, to mean that you should basically pursue other ideas. This may vary a bit between panels, but the opening review statement (where the percentages in each category are reported) should be a good indication of how the panel used each designation.

As one might expect, however, the process is generally pretty similar between IOS and DEB. The science being proposed in each division is excellent and the bar is high for even the preproposals. The target invite rate seems to be in the 20-30% range at the preproposal stage, significantly winnowing the field.

9 responses so far

DEB looking for your input

Apr 18 2013 Published under [Education&Careers]

DEB is soliciting feedback from the community on the effectiveness of the preproposal process. Their survey (http://www.surveymonkey.com/s/DEB_IOS_program_feedback) is open to anyone who has applied to DEB or IOS in the last two rounds, or who intends to apply in the next two years. As many of you know, NSF is generally constrained by federal regulations from requesting general feedback on its processes, so this is a good opportunity to weigh in.

No responses yet

Grant rant 2

Apr 15 2013 Published under [Education&Careers]

Readability. Possibly the single most important proposal feature. I'll take a clear proposal on decent science over a convoluted proposal on groundbreaking work, 7 days a week and twice on Sunday.

Especially for NSF preproposals, if you are not writing for a tired reviewer who is not in your field, you are doing it wrong. Remember, there are no ad hocs on these proposals, so unless you are working on an obvious model in a big field, chances are many or most of the reviewers won't have a background in what you are proposing. This differs, to a degree, from full proposals. However, readability is still critical.

Before you submit, have someone outside your field read your proposal and ask them specifically to comment on how well they understood it. I can't tell you how many proposals I have read with five-line sentences that take the reader on a tortuous voyage. No matter how eloquent you think you are being with these long-winded soliloquies, I beg you to reconsider.

6 responses so far

The overhead "problem"

Apr 11 2013 Published under [Education&Careers]

Everyone is looking for ways to keep research money on the table for the science these days. I watched the end of the OSTP budget request presentations yesterday and caught both the NSF and NOAA announcements. In both cases, and in the breakout sessions, each agency made a point of mentioning that only ~6% and ~4% of their budgets, respectively, go to administrative costs. The subtext, of course, is that they channel almost every tax dollar they can directly into the science.

But. That money for science doesn't all go to science, does it? In any discussion of declining science budgets, university overhead always comes up. Each institution negotiates its own overhead rate with the Feds; mine happens to be just north of 50%. This means that for every direct-cost dollar I bring in, a little more than 50 cents goes to the university on top of it.
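
To make that split concrete, here is a back-of-the-envelope sketch with made-up numbers: a hypothetical 52% rate applied to all direct costs (real awards typically apply the rate to modified total direct costs, which exclude items like equipment and tuition, so the actual share is a bit lower).

    # Hypothetical numbers: how a 52% indirect rate splits an award
    direct = 100_000           # dollars the PI can actually spend
    rate = 0.52                # hypothetical negotiated overhead rate
    indirect = direct * rate   # $52,000 goes to the institution
    total = direct + indirect  # $152,000 is the full award
    print(f"indirect = ${indirect:,.0f}, total award = ${total:,.0f}")
    print(f"overhead share of total award = {indirect / total:.0%}")  # ~34%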

Fascinatingly (to me, anyway), this number plays out very differently at NIH and NSF. NIH reviewers see only the direct costs (the dollars the PI can actually spend) when a project is reviewed. NSF reviewers get the total budget, with both direct and indirect costs factored in. In theory, proposals carrying higher overhead rates can be penalized at NSF, whereas those costs are hidden from review at NIH.

That little tidbit aside, overhead money ALWAYS comes up in conversations about federal research budgets. "Reduce the overhead rate and we'll all get funded!" While that sounds well and good, the conjured image of overhead-funded Dean's Maseratis is so ridiculous that it's hard to engage anyone spouting that view with a straight face. So what does overhead really do?

Among many things, overhead has two major functions: 1) paying for the research enterprise, and 2) funding start-up packages. Point 2 is pretty straightforward: a research career isn't going to get off the ground without funds to generate data before the first grants roll in. The first point, however, is where many people seem to have a blind spot. It's a blind spot often created by never having to worry about the costs that this evil overhead covers, and by how unclear it often is what it actually costs to keep the lights on and the research supported. If we ignore the support staff funded off overhead money (a major cost) and just focus on lab space, what does that cost?

Ironically, crowdfunding crusader Ethan Perlstein is providing the answer. As Drugmonkey highlighted, Ethan is reporting a market cost of $900/bench/month. And that's just an empty bench.

How many benches does your lab occupy? How about equipment use (because, recall, the crowdfunding movement comes without start-up funds)? Service contracts? I can only speak from my own experience, but overhead covers not only the cost of my bench space and everything associated with it, but also certain costs that I can't use federal money for. Software? Computers (in most cases)? Service contracts for equipment in my lab can run north of $20K annually (which is relatively small), and luckily we get much of that covered by college overhead. And we're not even talking about specialized bench space (laminar flow hoods?) or activities that need to be conducted in physically separated spaces.

This is why I can't take people seriously when they claim that the current academic model is just a way for universities to gorge themselves on the federal government. These proclamations are often tightly coupled with a flagrant lack of appreciation for what the infrastructure of the research enterprise actually costs. Maybe you can convince the general public to donate $20,000 for a project. Maybe. But it had better get done fast if you're bleeding out at a base rate of $900/month per bench before you pay for staff, reagents, equipment, or computers.
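
For a rough sense of scale, using the same hypothetical single-bench numbers:

    # Hypothetical: how long $20,000 lasts if bare bench rent is $900/month
    budget = 20_000
    bench_rent = 900
    print(f"{budget / bench_rent:.0f} months of empty bench")  # ~22 months, before
                                                               # staff, reagents, or gear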

Research doesn't happen in a vacuum. Without an appreciation for the actual cost, and not just what comes out of the direct-costs budget, this independent research ideal is hamstrung before the race even starts. Making improvements to the way we do research is an admirable goal, but strutting about as though only fools can't see your new clothes is a story we've already heard.

43 responses so far

Grant rant

Apr 09 2013 Published under [Education&Careers]

Don't use numbered references in a grant proposal. Just don't. Save the space elsewhere; not every word you write is danced upon by angels. Cut a couple of sentences and get the (Author, year) refs back in there. Because I'm not going to flip to the reference section to look these up, and I'm not going to get nearly as good a feel for YOUR contribution to this field if you're not citing your lab's work in the text.

28 responses so far
