Archive for the '[Biology&Environment]' category

Lamar Smith ups the anti.

Apr 29 2013 Published by under [Biology&Environment]

Unfortunately I don't have time right now to do this justice, but after my post on Friday, I couldn't let Lamar Smith's new bill to remove peer review from NSF go unmentioned. In addition to this ridiculous bill that would undo non-medical science in this country, he has also requested the reviews and PO comments for five particular grants whose titles he didn't like. For someone with such an appalling voting record when it comes to science, suddenly Lamar thinks he can step up and do a little grant reviewing? The right wing's naked anti-science agenda is really reaching the absurd. Clearly they are not even attempting to hide it anymore.

12 responses so far

NSF preproposal specific aims section

Apr 02 2013 Published by under [Biology&Environment]

The NSF preproposal is a fairly new document and people who submit to DEB and IOS are still trying to figure it out. I'm currently reading proposals for my second preproposal panel and some patterns are starting to emerge. In particular, how people handle the Specific Aims section 1) makes a big difference in the flow of the document, and 2) correlates pretty heavily with whether I suspect the PI has NIH experience.

There seem to be three flavors of SA that I see recurring:

1. Just the facts

Some PIs are simply stating the aims of the project with no supporting text. Just two or three aims at the top of the document, and then it's straight into the background.

2. The hybrid

Others are including a bit more than approach 1, but wrapping the whole thing up in half a page or less. There's some context, but it is mostly focused on fleshing out the Aims a bit.

3. The pager

These generally stick to the NIH format of taking a page to nail down wtf you are proposing to do. There's a certain format to these pages (one opinion here) and it should do a good job of summarizing the science of the proposal concisely.

I may be biased here, so take my opinion with a rock of salt, but I find option 3 to be by far the more readable format. Option 1 is jarringly disjointed and option 2 never seems to do quite enough for my reviewing tastes. My guess is that some people feel that the third type of SA section is redundant with the Project Summary, but the summary is very specific and includes Broader Impacts. If you're sly, you can use the summary to include a few interesting tidbits before hammering your best stuff home in the SA page.

Like many novel NSF documents (see: postdoc mentoring plan, data management plan), it's going to take a few rounds before things settle in and people have a feel for what to expect. For those of you who have written preproposals, how did you handle the SA section? What are you reviewers seeing?

12 responses so far

Corporate Bio-knowledge FAIL

Nov 29 2012 Published by under [Biology&Environment]

Um, yeah. As much as "amphipod" sounds kinda like "amphibian", they are not actually the same thing. But the people at Amphipod Running Products probably didn't want their logo to look like this:

Pic from here

Although, I gotta say it's a lot more interesting than a frog.

7 responses so far

Thoughts on the ESA letter to NSF Bio

From multiple fronts I have recently been made aware of an effort that started at the Ecological Society of America to formally protest the new NSF Bio regulations on grant submissions. For those not aware, the short version of the changes includes a move from two annual cycles to one, the institution of a preproposal stage, and a limit of two grants per PI in any capacity.

These changes were met with resistance and the general feeling that early career people were going to take it in the teeth. Only the DEB and IOS programs in Bio initially made these changes, but MCB recently let it be known that they were going to do the same after a year utilizing a different approach.

The ESA letter focuses on the concern that the new policy is going to significantly slow the pace of science in the Bio Directorate, an opinion I have written about in the past. I'll deal with each of the points of the letter separately:

The process creates an exceedingly long lag between the time when ideas are first proposed and when funding becomes available to investigators. Even if the two preproposals allowed per Investigator per year are successful, it takes over one year from submission to funding (as opposed to 6-9 months in the former system). This lag time increases to over two years or longer if a preproposal is unsuccessful. The increased lag period comes at a time when the rapid pace of environmental change requires science-based solutions to address societal needs. It also hinders the development and deployment of new tools and technologies (e.g., molecular and informatic) that inform solutions that address such rapid environmental change. The long lag between idea generation and funding is particularly hard on junior scientists who are establishing their research programs, but also hinders progress of more senior scientists seeking to sustain active research programs and to educate the next generation of scientists.

It's hard to argue with this one, which basically summarizes several points that have become a familiar refrain. When I served on the preproposal panel last spring, these exact points came up. The NSF party line* is basically that the extra time now built into the process is critical for PIs to incorporate the feedback they received from the previous round. The claim is that internal studies demonstrated that proposals that were turned around from one round to the next did not fare as well as those that took a round off. Unfortunately, there is no NSF equivalent of Rock Talk to show these data, so we can't pick them apart. We're left wondering whether these data are a game of averages or represent real trends. Nevertheless, from the perspective of a PI, the loss of one round per year is a significant blow, no matter how NSF wants to spin it.

The process limits the scope of science by (1) selecting against complex, interdisciplinary science that cannot be convincingly described in four pages and (2) hindering collaboration among scientists (by limiting the number of submissions per investigator per year) at a time when research programs and teams need to be increasingly multifaceted, innovative, and interdisciplinary to address complex issues.

This is where I'm a little less enthusiastic about the approach. The first argument about complexity that can't be described in four pages is, to me, a little weak. Is it hard to encapsulate a complex project in four pages? Yes. But claiming my science is SO complex you just don't get it because I can't tell you all about it is not exactly winning me over. Make it work.

The second point is stronger. For a decade plus we have been told that interdisciplinary is the way to get things done. Multi-PI projects have been encouraged at every level, from federal to institutional, and suddenly it is a liability. Now we have to pick and choose what we can contribute to in order to stay within an arbitrary limit of proposals. Okaaaaaay.

The process limits feedback to scientists, slowing the pace at which creative ideas advance during the iterative submission-resubmission process, because of the lack of ad hoc reviews for proposals. Although investigators faced low rates of proposal success with the former process, it at least offered comprehensive feedback and allowed for relatively quick resubmission, increasing the chances for success with future submissions.

This is another point that I don't see eye to eye with. If you have ever served on a panel, you know that the opinions that really matter are in the room. Whereas the ad hoc reviews can be very informative to the panelists, they can just as easily be almost ignored. The POs will ask the lead reviewer to comment on the ad hocs, especially if they are at odds with the panelist opinions, but they do not carry the weight of the panelist reviews. Additionally, my experience in the preproposal panel suggested that NSF can get pretty good coverage of the vast majority of the proposals with the panelists, not unlike NIH's study sections. Besides, the PI still gets at least three reviews back and if it didn't get trashed in those reviews, a panel summary as well. Are the additional couple of ad hoc reviews really that important?

The delays, the reduced opportunities for collaborative proposals, and the more limited feedback are likely to have a disproportionate effect on young scientists and members of groups who are not yet well represented in our science. We fear that this new process will result in the loss of some very promising people from the pipeline who are already discouraged by bleak prospects for funding research.

This is a concern shared by NSF. When the Director of Bio came to speak to our panel in the spring, this was something that he cited as being at the top of the list of things they are monitoring. I have to say that most panelists were not necessarily comforted that the situation was being monitored, but it was made clear that the POs were being directed to keep their portfolios balanced as before. How the preproposal process is going to affect that is not entirely clear.

One rumor I have heard repeatedly, however, is that only established labs made it through the preproposal process. I can say from my experience as both a panelist and a preproposal PI, this is not true. I can understand the perception that those with more proposal writing experience were able to navigate the new process better, but we pushed many early career preproposals through in my panel and I had one accepted from my own lab as the sole PI. I don't buy the fear mongering.

The letter concludes with the following:

We are optimistic that thoughtful modifications of the new preproposal process, made in consultation with the ecological and environmental sciences community, will ensure that science progresses as rapidly as possible given the level of funds available, thereby providing maximum benefit to society. In any such modifications, we believe it is essential (1) to ease current restrictions that limit collaboration and the pursuit of high-risk, high-reward ideas and (2) to provide two deadlines per year, even if that requires taking other measures, such as reducing the number of ad hoc reviews or reducing proposal length, to ensure reasonable workloads for NSF staff and the reviewer community.

Here's my biggest problem with this letter. Where is the solution? It is very easy to say that the current system is not working and we need to make changes, but don't place that burden right back in the lap of NSF if you want to make changes! Make a PROPOSAL. I've made some very clear suggestions geared towards solving these exact issues, but I have seen almost no other concrete proposals made. I think this is where we, as a community, are doing ourselves the most harm. We need to hammer out what we want and how to implement it if we want to gain any traction, otherwise NSF can throw up their collective hands and say "this is the best we can do."

So will I be signing this petition? No. As much as it summarizes some of the concerns I share, I don't think it has much value without suggestions for real change. I think NSF was fully aware they would get pushback from the community based on these changes and this type of letter is going to fall right into the "expected whining" camp. Until the community can present a meaningful document with realistic changes, there is no impetus for NSF to do anything.

I see this as a labor negotiation. NSF changed the rules of "payment" and has left their "payees" upset with the new environment. Unless we can make a convincing case (including some concessions**) for change, we're going to have to live by the new rules.

*Make no mistake, there are wildly different opinions on the changes among the Program Officers.

**Sorry folks, but NSF believed strongly that the system was unsustainable. If you want to convince them to change, you need to make concessions that address the issues.

16 responses so far

Sea slugs stealing more than plastids

Jun 14 2012 Published by under [Biology&Environment]

I've been fascinated by the story of the Elysia sea slugs for some time and have blogged about it before. In 2010 I covered the first paper with decent molecular evidence of a gene transfer from an alga to a sea slug, and a follow-up study. Earlier this year a number of papers came out with some new twists, and recently I saw that another chapter in the story, by Pierce et al. (2012), has made it to print.

Elysia chlorotica (source)

If this story is unfamiliar you can read all the background in the two previous posts linked above, but this system appears to be very unusual. Essentially the sea slug hijacks the plastids (chloroplasts) from an alga and keeps them functioning for nearly a year. The plastids are the only parts of the alga maintained by the slug and once they are established, the slug no longer feeds. Not a big deal except for the fact that plastids need thousands of proteins that are encoded in the algal nucleus and targeted to function in the plastid. Where do those proteins come from when the algal nucleus is gone?

For years people have believed that the answer lies in the sea slug nucleus. If genes of algal origin were currently encoded in the sea slug nuclei, then the animal might be able to keep the plastids going. One complication is the need for a protein targeting system to get the slug-manufactured proteins to the plastids that need them. In their native alga, four membranes separate the plastid stroma from the cytoplasm. It is not clear how many membranes remain around the plastids once the sea slugs have ingested them, but a "targeting signal" at the N-terminus of the unfolded peptide (encoded at the 5' end of the transcript) and a number of chaperones are typically required for a protein to arrive at the intended destination. Thus, it is not as easy as just acquiring a couple of novel genes.

A simplified diagram of gene transfer and targeting back to the plastid. (source)

In order to identify algal genes in the sea slug, Pierce et al. (2012) took a circuitous route. First they sequenced the genome and transcriptome of the alga the sea slug feeds on, Vaucheria litorea. This was done to produce an as-complete-as-possible algal protein set. Next, they sequenced the transcriptome of adult sea slugs with plastids, starved for 2 months. The idea here is to ensure that no algal contamination should be present, just the hijacked plastids and whatever the sea slug is using to keep them active. At this point, it was a simple comparison: are there any algal genes in the sea slug transcriptome?

The first thing they discovered is that the plastids are transcriptionally active. Transcripts from genes encoded in the plastid were identified, indicating that the plastid is making proteins. Why is this important? Because the plastid can't make proteins without some help from the nucleus. Plastids do not encode a complete set of genes for the machinery to carry out protein synthesis.

Next, they scanned the slug transcriptome for transcripts that could not be traced back to the plastid genome and found 52 hits to algal genes. Many of these (27) represented genes with functions related to photosynthesis and carbon fixation, with the remainder having unknown functions. Interestingly, all of the putatively algal transcripts found in the sea slug transcriptome were in very low copy number and most did not overlap. This is in strong contrast to typical plastid-targeted transcripts, which can be among the most abundant in actively photosynthesizing cells.

Additionally, many of the transcripts had a small number of nucleotide changes (1-6 bp) when compared to the algal copy. Is this an indication that they are indeed encoded in the sea slug genome and their sequence is drifting from that of the alga? Perhaps.
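Just to make the logic of the screen concrete: for each slug transcript, the question is whether it matches the plastid genome, the algal nuclear gene set, or neither. Here's a minimal sketch in Python, with invented toy sequences and a placeholder similarity test standing in for the alignment-based searches the authors actually used; none of these names come from the paper.

```python
# Hypothetical sketch of the screening logic: classify each slug
# transcript as plastid-encoded, putatively algal (a horizontal-transfer
# candidate), or slug-native. Real pipelines use BLAST-style alignment;
# here "matches" is a stand-in for a sequence-similarity test.

def classify_transcripts(slug_transcripts, plastid_genes, algal_nuclear_genes):
    """Return a dict mapping each transcript ID to a category."""
    def matches(seq, reference_set):
        # Placeholder similarity test: exact membership. A real analysis
        # would align sequences and apply identity cutoffs.
        return seq in reference_set

    categories = {}
    for tid, seq in slug_transcripts.items():
        if matches(seq, plastid_genes):
            # Transcribed from the hijacked plastid's own genome
            categories[tid] = "plastid-encoded"
        elif matches(seq, algal_nuclear_genes):
            # Algal nuclear gene expressed by the slug: HGT candidate
            categories[tid] = "putative-algal-HGT"
        else:
            categories[tid] = "slug-native"
    return categories

# Toy data (invented sequences)
slug = {"t1": "ATGCC", "t2": "GGTAA", "t3": "TTACG"}
plastid = {"ATGCC"}
algal_nuclear = {"GGTAA"}

result = classify_transcripts(slug, plastid, algal_nuclear)
# t1 -> plastid-encoded, t2 -> putative-algal-HGT, t3 -> slug-native
```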

After a several-paragraph bashing of papers I previously covered here, the authors conclude that the sea slug Elysia chlorotica nucleus contains at least 60 genes that have been stolen from its algal prey. It uses these genes to supply proteins to the hijacked plastids for their continued function, albeit at low transcript abundance. The sea slug genome sequence is nearly finished, so the story may yet reveal further surprises.

Pierce, S., Fang, X., Schwartz, J., Jiang, X., Zhao, W., Curtis, N., Kocot, K., Yang, B., & Wang, J. (2012). Transcriptomic evidence for the expression of horizontally transferred algal nuclear genes in the photosynthetic sea slug, Elysia chlorotica. Molecular Biology and Evolution DOI: 10.1093/molbev/msr316

10 responses so far

Repost for World Oceans Day: Solar Sea Slugs

Jun 08 2012 Published by under [Biology&Environment]

Today is World Oceans Day, so I thought I would repost the closest thing I have to a post on ocean life. This story is one that continues to get more interesting. I updated it (link below) to include some recent work last year and have been trying to find the time to write about another paper on the topic, published in 2011. Each new paper on these organisms has made the story both more interesting and more confounding, which is the kind of thing that always keeps my attention.

Original image here.

(Update here)

Sea slugs are far more interesting than their name might imply. Aside from being beautiful, they have some unusual ways of making a living. In the case of a few unrelated species, they steal for a living.

A handful of sea slugs have found a way to make the most of the algae they eat. As they take in the cellular contents of the algae, they are able to separate the components and isolate the light harvesting organelles, the plastids. Whereas the majority of the chewed algal material is digested, the plastids are spread throughout the specialized digestive tract of the slugs so that they form a layer over the upper part of the slug, giving it a green color. Depending on the species of slug, the animal can then rely entirely on the plastids for weeks to months as the sole source of energy, making them the solar-powered slugs.

Elysia chlorotica feed on algae and then steal the plastids to harness solar energy.

But here is where the story gets interesting. It is well documented that plastids have high protein turnover, especially in their light harvesting complexes. The constant barrage of photons breaks the antenna proteins down and they need to be constantly replaced. Those proteins, however, are not produced by genes encoded on the genome of the plastid. Instead, they are nucleus-encoded, translated in the cytosol, and targeted to the plastid thanks to a signature extension encoded at the 5' end of the transcript. Therefore, the algal nucleus is essential for the continued maintenance of functional plastids. But the slugs sequester ONLY the plastids, no nuclei. How do the slugs keep the plastids going for months in the absence of the algal nuclei and the essential plastid proteins they produce?

In 2008, it appeared that there might be an answer. Rumpho et al. (2008) looked at the genome of E. chlorotica and identified genes that appeared to be derived from the algal nucleus. Gene transfer from the nucleus of an alga to that of an animal for the purpose of allowing the slug to maintain its own plastids! Needless to say, this story got a lot of press. The only issue was that very few genes were found, and none of the major antenna proteins one would expect to be there. But it was a PCR-based survey, not a genome sequence, so people reasoned that the genes were there, just not PCR friendly for whatever reason.

Not so fast, claim Wägele et al. in an article released yesterday in Molecular Biology and Evolution. Wägele et al. sequenced cDNA from two slug species, made from RNA transcripts (genes being expressed by the animal) extracted from plastid-containing slugs that were kept without a food source and were thus entirely dependent on the plastids for carbon. Based on the findings of Rumpho et al. (2008), the expectation would be that genes encoded in the slug nucleus that had been transferred there from the alga for the purpose of plastid maintenance would be highly expressed under these conditions. After all, the plastids are actively photosynthesizing for the slug and have to deal with the wear and tear of the job.

But contrary to expectation, Wägele et al. found NO evidence of the antenna proteins, the Calvin cycle enzymes or the small subunit of RuBisCO (which is absent from the plastid genome of the algae the collected slugs were feeding on). This means that the plastids have no back-up proteins - once a protein that cannot be made by the plastid breaks down, that's it. Based on what we know about plastids, this should happen within a matter of days - without a constant stream of new proteins from the nucleus, the photosynthetic apparatus should cease to work.

But that is NOT what we observe in the sea slugs. They maintain functional plastids for MONTHS. One explanation is that transcript levels of these critical proteins were too low for detection, but this is an entirely unsatisfying conclusion because 1) next generation sequencing was used to produce very deep coverage of the transcripts, and 2) the small subunit of RuBisCO alone accounts for roughly 15% of all transcripts in young plant leaves. The probability of missing all of the transcripts necessary for the plastids to survive is virtually nil.

So what is going on? The answer, I'm afraid, is elusive. What we see in nature cannot be explained by what we know about the components of the system. The proteins are not being made by the slug and the plastids cannot survive as long as they do without replacement proteins, or so our current knowledge would suggest. Something has to give to explain our observations and I am eager to see what it is. With the E. chlorotica genome soon to be completed, answers may be on the horizon... or not.

Rumpho ME, Worful JM, Lee J, Kannan K, Tyler MS, Bhattacharya D, Moustafa A, & Manhart JR (2008). Horizontal gene transfer of the algal nuclear gene psbO to the photosynthetic sea slug Elysia chlorotica. Proceedings of the National Academy of Sciences of the United States of America, 105 (46), 17867-71 PMID: 19004808

Wägele, H., Deusch, O., Händeler, K., Martin, R., Schmitt, V., Christa, G., Pinzger, B., Gould, S., Dagan, T., Klussmann-Kolb, A., & Martin, W. (2010). Transcriptomic evidence that longevity of acquired plastids in the photosynthetic slugs Elysia timida and Plakobranchus ocellatus does not entail lateral transfer of algal nuclear genes. Molecular Biology and Evolution DOI: 10.1093/molbev/msq239

One response so far

What I learned at an NSF Bio preproposal panel

It was my intention last week to blog a bit about the NSF Bio Preproposals while I was in DC, but that just didn't happen. What did I learn? Well, I can tell you what happened on one panel, but I got the impression that there is some decent variability in the system right now. Below are a few observations.

- The Big Idea is no more important than before. In talking to people leading up to writing the preproposals, there was a lot of emphasis on selling your Big Idea. Theoretically there was going to be less emphasis on methods and more on potential. Well, that kinda turned out to be bullshit. They were essentially judged like mini-proposals.

- Cut rate. Our target cut was supposed to be 80%, with 15% landing in the category of "High Priority Invite" and 5% in "Low Priority Invite". In the end we were closer to 20% HPI and 15% LPI, with everything else landing in "Do Not Invite". How many from the LPI category will actually get invited was not clear.

- Two flavors. A lot of good science went into the DNI category. If you get a panel summary you will know why: the DNIs with a summary were discussed, the rest were triaged.

- Triage. Anything that got fewer than three ratings of "good" in the preproposal stage was not discussed. We slapped a boiler-plate panel summary on those and moved on. Roughly 25% of the initial pool went undiscussed.

- BI still matters. Thought you could short the Broader Impacts section just because it was a preproposal? Wrong. A crappy BI section bumped several proposals to DNI.

- Small proposals get killed. The party line at NSF has long been that there is no reason for a small grant mechanism because you could always send in a small proposal. Well, guess what happens when you remove the budget and measure all proposals with the same stick? Yeah.

- NSF is worried about new PIs too. Much was made of the concern for the N00bs, and NSF is watching this closely. If new PIs get disproportionately whacked, there will be a correction (this goes for RUI as well). BTW, the current average time to grant is three years.

- Possible funding from preproposals. One possibility that came up was that the very top preproposals might, in future years, just get funded without a full proposal. Things are still in flux, but my panel felt it could make some awards now.

- Incorporation of preproposal panel members for the full proposals is being discussed. There was some concern by panelists over having two very different hoops for the PIs to jump through, between the preproposals and the full proposals. One way to keep some continuity could be if preproposal panelists agreed to serve as ad hocs for the proposals that got invited and for which they were the primary reviewer in the first round. Everyone started with eight primaries, but I don't think anyone had more than two or three make it to the invite stage, which would be a manageable load to ad hoc.

- Shrinking everything. The full proposal may be going on a diet soon, too.

- Three's often enough. Most proposals got a pretty fair treatment with three people having read them and no ad hocs. There may have been some that could have benefited or been hurt, but overall a panel-only review didn't seem to be an issue.

- The long month. The time line to notifications is roughly a month for those getting an invite, longer for those that got bumped.

So, do I think this is going to be an effective process? It's too early to tell. Just based on my preferences as a PI, I think I would prefer something more along the lines of the 8 month cycle that MCB has gone to, so that you can more effectively manage your grant load. As it stands now, anyone who can apply to DEB and IOS could have 4 preproposals going in every January. The 8 month cycle with 1 proposal per round would spare people the year between submissions and spread the load out a bit.

But we will have to see how it all goes. One thing was clear: no one, at NSF or among the PIs, really knows how this is all going to play out.

41 responses so far

NSF preproposal update

For those of you NSFers out there, I thought I would give you an update on what I know so far about the preproposal process:

- A while back I mentioned that I expected roughly 3X the normal load of proposals, based on the number of names on the Conflict of Interest document. Surprisingly, I'm seeing MORE co-PI proposals than before, suggesting that the limit of two proposals for every PI/co-PI is not much of a factor. More people seem to be spreading their effort around, resulting in several proposals in my batch with >4 co-PIs.

- As for straight numbers, I am reviewing a little less than 2X the number of preproposals compared to the regular proposals I received last time. Given that they are significantly shorter, I am spending less time reading, but we still have to write a review for each one.

- The review at this stage is similar, but we are being asked to triage 80% of the preproposals. POs want 15% in the "high priority invite" category and 5% in the "low priority invite" category when all is said and done. Therefore, we are being instructed to be more aggressive in our reviews, to whittle down the numbers that will eventually be discussed. In the past, every proposal has at least been discussed at panel. Those days appear to be over and triaged proposals will not be getting detailed panel summaries back.

- In practical terms, preproposals getting ratings of Good, Very Good and Excellent have a shot at being discussed. Anything that gets a Fair or Poor hits the round file.
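For the programmatically inclined, my reading of that triage rule boils down to something like this. This is a sketch of one panel's practice as I understood it, not official NSF policy, and the function name is my own invention:

```python
# Sketch of the triage rule as described above: a preproposal only has
# a shot at discussion if every rating is "Good" or better; a single
# "Fair" or "Poor" sends it to the round file.

RATING_ORDER = ["Poor", "Fair", "Good", "Very Good", "Excellent"]

def is_discussed(ratings):
    """True if all ratings meet or beat 'Good'."""
    floor = RATING_ORDER.index("Good")
    return all(RATING_ORDER.index(r) >= floor for r in ratings)

print(is_discussed(["Good", "Very Good", "Excellent"]))  # True
print(is_discussed(["Excellent", "Excellent", "Fair"]))  # False: triaged
```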

That's what I know so far, more to come.

7 responses so far

A semester of women in science

Here's a question I have been pondering this evening and I would like some help getting an answer: How many departments can we convince to field an all-female invited speaker slate for the fall semester of 2012?

This was spurred by a twitter conversation with @phylogenomics and @duffy_ma that began with Jonathan Eisen suggesting that he wanted to start a conference with all women presenters.

In my limited scientific capacity I have tried a version of this experiment. I organize my department's seminar series, and a year ago I decided I was going to invite only women and not tell anyone in my department. I was nearly able to fill the semester with all female speakers, but a few visiting scientists ruined my sweep. However, not a single person in the department ever appeared to notice.

But I would like to take this a step further. For the fall of 2012 I would like to not only invite a full slate of female scientists as speakers, but I plan to get my colleagues on board with the plan. Beyond that, I want to get you, dear readers, on board as well. Between you and me, let's see just how many departments we can convince to invite only women to give talks in the fall of 2012.

I plan to bring it up in faculty meeting. I'll inform my department that I'm doing this and ask for suggestions to be sent. By doing it publicly, anyone who scoffs will be announcing that they are an asshole. It may already be widely known, but it never hurts to let these people self identify. By presenting my plan as a done deal, my guess is that people will buy in. Everyone likes something a little different and also likes someone else to do the work.

At the same time, I'm not going to make a big deal of it! No announcements, no drawing attention so that people will say "Oh, look. Affirmative action!" because you and I know that such a sentiment will only make people put a mental asterisk next to the speaker list, which is bullshit. In fact, I have been trying to get three female speakers in for two years and they have been too busy traveling. I'm not looking for any reason for people to view the speaker list as anything different, just like no one would bat an eye if it was an all white sausage party.

I like this idea for lots of reasons, but 1) I'm hoping it will help me convince a couple of my favorite scientists to make the visit in support of the notion, and 2) I want my colleagues to at least get it into the back of their minds that we generally invite too many old white dudes.

So help me out, people. Who else thinks they can convince their department to spend a semester listening to some awesome science done by kick ass women?

34 responses so far

Poll: Field of danger

The semester is already kicking my ass, but I have a quick hypothetical situation for your ponderation this morning.

The situation involves field work for your project, whether you be a PI with funding for an individual project, a grad student who has secured some funds to complete a project, or whatever. Anyone who has done field work knows that there is always a nature component to the work and my question today revolves around how much that controls what you do.

You've arrived at your field site and the conditions are not ideal. Weather has kicked up that makes the work you are planning a little more dangerous than the expected conditions. While you would not be prevented from doing the work, the added elements give you pause. You've spent the budgeted funds available to make the trip and the samples/data are critical to the work you proposed. Because your work is seasonally dependent, you would have to wait until next year to collect these data if you opt out of the work this time, assuming you could scrape the funds together to make the trip again.

What do you do?

19 responses so far
