[Today’s live-blog is brought to you by Yvonne Nobis, science librarian at Cambridge, UK. Thanks, Yvonne! — Mike.]

Session 1 — The Journal Article: is the end in sight?

Slightly late start due to trains!

Just arrived to hear Aileen Fyfe (University of St Andrews) saying that something similar to journal articles will be needed for ‘quite some time’.

Steven Hall, IOP.

The article still fulfils its primary roles — the registration, dissemination, certification and archiving of scholarly information. The journal article still provides a fixed point, and researchers still see the article as a critical part of research — although it is now evolving into something much more fluid.

Steve then outlined some of the initiatives that IOP have implemented. Examples include the development of thesauri — every article is ‘semantically fingerprinted’. No particular claims are made for IOP innovation — some are broad industry initiatives — but they demonstrate how the journal article has evolved.

(Personal bias: as a librarian I like the IOP journal and ebook offering!) IOP have worked with RIN on a study of researcher behaviour in the physical sciences — research into the impact of new technology on researchers. Primary conclusion: researchers in the physical sciences are conservative and, oddly, see the journal article as the most important method of communicating research. (This seems at odds with their use of arXiv?)

Discussion

Mike Brady discusses the ‘floribunda’ of the 19th century scholarly publishing environment.

Sally Shuttleworth (Oxford) questions the move from the gentleman scholar to the publishing machinery of the 21st century and wonders if there will be a resurgence due to citizen science?

Tim Smith (CERN) proposes that change is being technologically driven.

Stuart Taylor (Royal Society Publishing) agrees with Steve that there is a disconnect between reality and outlandish speculation about what should be in place — the ‘bells and whistles’ that publishers are adding into the mix go unused.

Cameron Neylon: what the web gives us is the ability to separate content from display — and this gives us a huge opportunity. Many of us in this room did predict the death of the article several years ago… (This was premature!)

Hermann Hauser makes the valid point that it is well-nigh impossible for a researcher now to understand the breadth of a whole field.

Ginny Barbour raises the question of incentives (the article still being the accepted de facto standard). The point was also raised that perhaps this meeting should be repeated with an audience 30 years younger…

No panel comment on this point; however, I fear what many would say is that this meeting represents the apex of a pyramid, where these discussions have occurred for years at other conferences (for example, the various Science Online and FORCE meetings) and have driven both innovation (novel publishing models) and the creation of tools.

I asked (predictably enough) about the use of arXiv — slightly surprised at the response to the RIN study.

Steve Hall: ‘science publishers are service providers’ — if scientific communities become clear about what they want, we can provide such services — but coherent thinking needs to underwrite this. Steve also questions the incentives put in place for researchers to publish in certain high impact journals and how this is damaging.

David Colquhoun raises the issue of perverse incentives for judging researchers, including altmetrics.

Steve Hall: arXiv won’t allow publishers on its governing bodies — and, interestingly, librarians (take note!) should be engaging with the storage of the data!

Aileen, in conclusion, asks how the plurality of modes of communication we had in the 18th and 19th centuries got closed down to purely journals. The issue of learned societies and their relationship with commercial agencies is often a cause for concern…

Session 2 — How might scientists communicate in the future?

Mike Brady

The role of the speakers is to catalyse discussion amongst ourselves…

Anita de Waard (Elsevier)

350 years ago science was an individual enterprise; now, although there are many large collaborations, much scientific discussion is still at a peer-to-peer level.

How do we unify the needs of the collective and individual scientists?

We need to create the systems of knowledge management that work for scientists, publishers and librarians.

Quotes John Perry Barlow: ‘Let us endeavour to build systems that allow a kid in Mali who wants to learn about proteomics to not be overwhelmed by the irrelevant and the untrue’ (It would be cruel to mention various issues with the Journal of Proteomics last year…)

The problem is that the paper is the overarching modus operandi. Citations to data are often citations to pictures. We need better ways of citing and connecting knowledge. ‘Papers are stories that persuade with data’, says Anita. She argues we need better ways of citing claims, and of constructing chains of evidence that can be traced to their source.

For this we need tools, and we need to build habits of citing evidence into all aspects of our educational system (starting at kindergarten)!

Another problem is that data cannot be found or integrated. (This, to my view, is something that the academic community should be tackling, not out-sourcing — which is the way I see this going…)

An understanding needs to evolve that science is a collective endeavour.

Anita is now covering scientific software (‘scientific software sucks’ is the quote attributed to Ben Goldacre yesterday) — it compares unfavourably to Amazon … not sure how true this is?

Anita is very dismissive of scientific software, arguing that it is not adequate — but often code is written for a particular purpose. (My view is that this is not something that can easily be commercially outsourced — high-energy physics, anyone?)

Mark Hahnel, FigShare

(FigShare was built as a way for Mark to curate/publish his own research.)

Mark opens with the data mandates of different funders (at Cambridge we are already feeling the effect of these) — especially EPSRC: all digital outputs from funded research must now be made available.

Mark talks about the Open Academic Tidal Wave (sorry, not a great link, but the only one I can find — thanks, Lou Woodley): we are at level 4 of this.

Mark surveyed publishers about what they see as the future of publishing in 2020 — they replied ‘version control on papers, data incorporated within the article’ — but the technology is there already; he uses the example of F1000Research.

Discussion

Mike Brady: It’s as well Imelda Marcos was not a scientist — following on from Anita’s claims that software for buying shoes is more fit for purpose than scientific software!

Hermann Hauser: willing to fund things that help with an ‘evidence engine’ to avoid repeats of the MMR fiasco!

David Colquhoun: science is not the same as buying shoes! Refreshingly cynical.

Wendy Hall stresses the importance of linking information — every publisher should have a semantically linked website (and on the science of buying shoes).

Comment from the floor: Getting more data into repositories may not be exciting but is essential. Mark agrees — once the data is there you can do things with it, such as building apps to extract what you need.

Richard Sever (Cold Spring Harbor Press) with a great quote: “The best way to store genomic data is in DNA.”

Mike Taylor: when we discuss how data is associated with papers we must ensure that this is ‘open’, this includes the APIs, to avoid repeating the ‘walled garden of silos’ in which we find ourselves now.

Question of electronic access in the future (Dave Garner) — how do we future-proof science? Very valid — we can’t access material from 1980s floppy disks!

Anita: data is entwined with software, and we need to preserve these executable components. Discussion returns to citation — data citation and incentives again — which has been a pervasive theme over the last couple of days.

Cameron Neylon: we need to move to a situation where we can publish data itself, and this can be an incremental process, not the current binary ‘publish or not publish’ situation (which of course comes back to incentives).

In summary, Mark questions timescales, and Anita wonders how the Royal Society can bring these topics to the world?

Time for lunch, and now over to Matthew Dovey to continue this afternoon (alongside Steven Hall another of my former colleagues)!

I’ll try to live-blog the first day of part 2 of the Royal Society’s Future of Scholarly Scientific Communication meeting, as I did for the first day of part 1. We’ll see how it goes.

Here’s the schedule for today and tomorrow.

Session 1: the reproducibility problem

Chair: Alex Halliday, vice-president of the Royal Society

Introduction to reproducibility. What it means, how to achieve it, what role funding organisations and publishers might play.

For an introduction/overview, see #FSSC – The role of openness and publishers in reproducible research.

Michele Dougherty, planetary scientist

It’s very humbling being at this meeting, when it’s so full of people who have done astonishing things. For example, Dougherty discovered an atmosphere around one of Saturn’s moons by an innovative use of magnetic field data. So many awesome people.

Her work is largely to do with very long-term projects involving planetary probes, e.g. the Cassini-Huygens probe. It’s going to be interesting to know what can be said about reproducibility of experiments that take decades and cost billions.

“The best science output you can obtain is as a result of collaboration with lots of different teams.”

Application of reproducibility here is about making the data from the probes available to the scientific community — and the general public — so that the result of analysis can be reproduced. So not experimental replication.

Such data often has a proprietary period (essentially an embargo) before its public release, partly because it’s taken 20 years to obtain and the team that did this should get the first crack at it. But it all has to be made publicly available.

Dorothy Bishop, chair of Academy of Medical Sciences group on replicability

The Royal Society is very much not the first to be talking about replicability — these discussions have been going on for years.

About 50% of studies in Bishop’s field are capable of replication. Numbers are even worse in some fields. Replication of drug trials is particularly important, as false results kill people.

Journals cause awful problems with impact-chasing: e.g. high-impact journals will publish sexy-looking autism studies with tiny samples, which no reputable medical journal would publish.

Statistical illiteracy is very widespread. Authors can give the impression of being statistically aware but in a superficial way.

Too much HARKing going on (Hypothesising After the Results are Known — searching a dataset for anything that looks statistically significant in the shallow p < 0.05 sense).
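
To see why this is so dangerous, here is a toy simulation (my own illustration, not from the talk): scan a hundred pure-noise comparisons and a handful will clear p < 0.05 by luck alone.

    # HARKing in miniature: test 100 variables with no real group difference,
    # then "discover" whichever ones happen to reach p < 0.05.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(42)
    group_a = rng.normal(size=(100, 20))  # 100 measures, 20 subjects per group
    group_b = rng.normal(size=(100, 20))

    p_values = ttest_ind(group_a, group_b, axis=1).pvalue
    print(f"{(p_values < 0.05).sum()} of 100 null effects reach p < 0.05")
    # Typically about five "significant" findings, every one of them spurious.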

“It’s just assumed that people doing research, know what they are doing. Often that’s just not the case.”

Many more criticisms of how the journal system encourages bad research — they’re coming much faster than I can type them. This is a storming talk; I wish a recording would be made available.

Employers are also to blame for prioritising expensive research proposals (= large grants) over good ones.

All of this causes non-replicable science.

Floor discussion

Lots of great stuff here that I just can’t capture, sorry. Best follow the tweet stream for the fast-moving stuff.

One highlight: Pat Brown thinks it’s not necessarily a problem if lots of statistically underpowered studies are performed, so long as they’re recognised as such. Dorothy Bishop politely but emphatically disagrees: they waste resources, and produce results that are not merely useless but actively wrong and harmful.

David Colquhoun comments from the floor: while the physical sciences consider “significant results” to be five sigmas (p < 0.000001), biomed is satisfied with slightly less than two sigmas (p < 0.05), which really should be interpreted only as “worth another look”.
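
For anyone who wants to check that correspondence between sigmas and p-values, it is a one-liner (a minimal sketch, using two-sided tail probabilities):

    # Convert a sigma threshold to a two-sided p-value.
    from scipy.stats import norm

    for sigmas in (2, 5):
        p = 2 * norm.sf(sigmas)  # sf = survival function, the upper tail
        print(f"{sigmas} sigma corresponds to p = {p:.2g}")
    # 2 sigma corresponds to p = 0.046 (so p < 0.05 is a bit under two sigmas)
    # 5 sigma corresponds to p = 5.7e-07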

Dorothy Bishop on publishing data, and authors’ reluctance to do so: “It should be accepted as a cultural norm that mistakes in data do happen, rather than shaming people who make data open.”

Coffee break

Nothing to report :-)

Session 2: what can be done to improve reproducibility?

Iain Hrynaszkiewicz, head of data, Nature

In an analysis of retractions of papers in PubMed Central, 2/3 were due to fraud and 20% due to error.

Access to methods and data is a prerequisite for replicability.

Pre-registration, sharing of data, reporting guidelines all help.

“Open access is important, but it’s only part of the solution. Openness is a means to an end.”

Hrynaszkiewicz says text-miners are a small minority of researchers. [That is true now, but I and others are confident this will change rapidly as the legal and technical barriers are removed: it has to, since automated reading is the only real solution to the problem of keeping up with an exponentially growing literature. — Ed.]

Floor discussion

I’m at the Royal Society today and tomorrow as part of the Future of Scholarly Scientific Communication conference. Here’s the programme.

I’m making some notes for my own benefit, and I thought I might as well do them in the form of a blog-post, which I will continuously update, in case anyone else is interested.

I stupidly didn’t make notes on the first two speakers, but let’s pick up from the third:

Deborah Shorley, ex-librarian of Imperial College London

Started out by saying that she feels her opinion, as a librarian, is irrelevant, because librarians are becoming irrelevant. A pretty incendiary opening!

Important observations:

“Scientific communication in itself doesn’t matter; what matters is that good science be communicated well.”

And regarding the model of giving papers to publishers gratis, then paying them for the privilege of reading them:

“I can’t think of any other area where such a dopey business model pertains.”

(On which, see Scott Aaronson’s brilliant take on this in his review of The Access Principle — the article that first woke me up to the importance of open access.)

Shorley wants to bring publishing skills back in-house, to the universities and their libraries, and do it all themselves. As far as I can make out, she simply sees no need for specialist publishers. (Note: I do not necessarily endorse all these views.)

“If we don’t seize the opportunity, market forces will prevail. And market forces in this case are not pretty.”

Robert Parker, ex-head of publishing, Royal Society of Chemistry

Feels that society publishers allowed themselves to be overtaken by commercial publishers. Notes that when he started working for the RSC’s publishing arm, it was “positively Dickensian”, using technology that would mostly have been familiar to Gutenberg. Failure to engage with authors and with technology allowed the commercial publishers to get ahead — something that is only now being redressed.

He’s talking an awful lot about the impact factors of their various journals.

My overall impression is that his perspective is much less radical than that of Deborah Shorley, wanting learned-society publishers to be better able to compete with the commercial publishers.

Gary Evoniuk, policy director at GlaxoSmithKline

GSK submits 300-400 scientific studies for publication each year.

Although the rise of online-only journals means there is no good reason not to publish any finding, they still find that negative results are harder to get published.

“The paper journal, and the paper article, will soon be dead. This makes me a little bit sad.”

He goes further and wonders whether we need journal articles at all? When actual results are often available long before the article, is the context and interpretation that it provides valuable enough to be worth all the effort that’s expended on it? [My answer: yes — Ed.]

Discussion now follows. I probably won’t attempt to blog it (not least because I will want to participate). Better check out the twitter stream.

Nigel Shadbolt, Open Data Institute

Begins by reflecting on a meeting ten years ago, convened at Southampton by Stevan Harnad, on … the future of scholarly scientific communication.

Still optimistic about the Semantic Web, as I guess we more or less have to be. [At least, about many separate small-sw semantic webs — Ed.] We’re starting to see regular search-engines like Google taking advantage of available machine-readable data to return better results.

Archiving data is important, of course; but it’s also going to be increasingly important to archive algorithms. GitHub is a useful prototype of this.

David Lambert, president/CEO, Internet2

Given how the digital revolution has transformed so many fields (shopping, auctions, newspapers, movies) why has scholarly communication been so slow to follow? [Because the incumbents with a vested interest in keeping things as they are have disproportionate influence due to their monopoly ownership of content and brands — Ed.]

Current publication models are not good at handling data. So we have to build a new model to handle data. In which case, why not build a new model to handle everything?

New “born-digital” researchers are influenced by the models of social networks: that is going to push them towards SN-like approaches of communicating more stuff, more often, in smaller units. This is going to affect how scholarly communication is done.

Along with this goes an increasing level of comfort with collaboration. [I’m not sure I see that — Ed.]

Bonus section: tweets from Stephen Curry

He posted these during the previous talk. Very important:

Ritu Dhand, Nature

[A disappointing and unconvincing apologia for the continuing existence and importance of traditional publishers, and especially Nature. You would think that they, and they alone, guard the gates of academia from the barbarians. *sigh*. — Ed.]

Lunch

Georgina Mace, UCL

[A defence of classical peer-review. Largely an overview of how peer-review is supposed to work.]

“It’s not perfect, it has its challenges, but it’s not broken yet.”

Richard Smith, ex-editor of BMJ

[An attack on classical peer-review.]

“Peer review is faith-, not evidence-based; ineffective; a lottery; slow; expensive; wasteful; ineffective; easily abused; biased; doesn’t detect fraud; irrelevant.

Apart from that, it’s perfect.”

He doesn’t want to reform peer-review, he wants to get rid of it. Publish, let the world decide. That’s the real peer-review.

He cites studies supporting his assertions. A Cochrane review concluded that there is no evidence that peer-review is effective. The Ioannidis paper shows that most published findings are false.

Someone should be recording this talk. It’s solid gold.

Annual cost of peer-review is $1.9 billion.

[There is much, much more. I can’t get it down quickly enough.]

Georgina Mace’s rebuttal

… amounts to contradicting Richard Smith’s evidence-supported statements, but she provides no evidence in support of her position.

Richard Smith’s counter-counter rebuttal

… cites a bunch more studies. This is solid. Solid.

For those who missed out, see Smith’s equally brutal paper Classical peer review: an empty gun. I find his conclusion (that we should just dump peer-review) emotionally hard to accept, but extremely compelling based on actual, you know, evidence.

Fascinating to hear the level of denial in the room. People really, really want to keep believing in peer-review, in spite of evidence. I understand that impulse, but I think it’s unbecoming in scientists.

The challenge for peer-review advocates is: produce evidence that it has value. No-one has responded to that.

Richard Sever, Cold Spring Harbor Laboratory Press

Richard presents the bioRxiv preprint server. Turns out it’s pronounced “bio-archive”, not “bye-orx-ive”.

Nothing in this talk will be new to regular SV-POW! readers, but he makes good, compelling points in favour of preprinting (which we of course agree with!)

Elizabeth Marincola, CEO, PLOS

PLOS is taking steps towards improving peer-review:

  • Use of article-level metrics
  • Moves towards open review
  • Move toward papers evolving over time, not being frozen at the point of publication
  • Better recognition of different kinds of contribution to papers
  • Intention to make submitted papers available to view before peer-review has been carried out, subject only to checks on ethical and technical standards: they aim to make papers available in “a matter of days”.

She notes that much of this is not original: elements of these approaches are in F1000Research, bioRxiv, etc.

Jan Velterop, science publisher with everyone at some point.

“I’m basically with Richard Smith when it comes to abolishing peer review, but I have a feeling it won’t happen in the next few weeks.”

The situation of publishers:

“Academia throws money at you. What do you do? You pick it up.”

Velterop gets a BIG laugh for this:

“Does peer-review benefit science? I think it does; and it also benefits many other journals.”

He quotes a Scholarly Kitchen blog-post[citation needed] as putting the cost of technical preparation at PubMed Central — translating from an MS-Word manuscript to valid JATS XML — at $47. So why do we pay $3000 APCs? Surely the peer-review phase doesn’t cost $2953?

Update: here is that Scholarly Kitchen article.

Velterop’s plan is to streamline the review-and-publish process as follows:

  • Author writes manuscript.
  • She solicits reviews from two experts, using her own knowledge of the field to determine who is suitably skilled.
  • They eventually sign off (perhaps after multiple rounds of revisions)
  • The author submits the manuscript, along with the endorsements.
  • The editor checks with the endorsers that they really have given endorsement.
  • The article is posted.

Bam, done!
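
The whole thing is simple enough to sketch as code. This is my own toy model of the steps above; the names and structure are illustrative, not anything Velterop presented:

    # Velterop's endorsement model in miniature: the "editor" merely verifies
    # that two experts really did sign off; there is no separate review phase.
    from dataclasses import dataclass, field

    @dataclass
    class Manuscript:
        author: str
        endorsers: list[str] = field(default_factory=list)

    def submit(ms: Manuscript, confirm) -> str:
        if len(ms.endorsers) < 2:
            return "rejected: needs sign-off from two experts"
        if all(confirm(name, ms) for name in ms.endorsers):
            return "posted"  # straight to publication once endorsements check out
        return "rejected: an endorsement could not be confirmed"

    ms = Manuscript("A. Author", endorsers=["Expert 1", "Expert 2"])
    print(submit(ms, lambda name, ms: True))  # -> posted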

And at that point in the proceedings, my battery was running dangerously low. I typed a tweet: “low battery may finally force me to shut up! #RSSC”, but literally between typing that and hitting the Tweet button, my laptop shut down. So that’s it for day 1. I’ll do a separate post for the second and final day.

There are probably many ways of getting a “90% complete” paper finished and ready for submission, but here’s the way that works for me. (It’s working for me right now: I’m in the middle of the process, and broke off to write this just for a break.)

You will need:

  • A printed copy of your manuscript
  • A red pen
  • A CD of Dar Williams songs that you know inside out
  • A bottle of red wine
  • A bar of white chocolate (optional)

Method:

Take the printed copy of the manuscript. Read it through, with the Dar Williams CD on in the background. Every time you see anything you don’t like, scribble on the printed copy with the red pen. It might be a typo, a misspelling, an infelicitous phrasing, a missing reference, a taxonomic name needing italics; or it might be something bigger, like two sections that need to be swapped.

Do you really need a printed copy for this? YES YOU DO! Can’t you just do it on the screen? NO YOU CAN’T! For one thing, you’ll keep breaking off to read email, which is a complete killer. For another, you’ve been working on this manuscript on screens for months already. Your poor brain is inoculated against its on-screen appearance. You need the mental jolt that a shift of format gives you. And you need the freedom to scribble. When I do this, I often write in suggestions to myself of what alternative wording to use, but I feel free to ignore them when I come to make the edits.

Do you really need a Dar Williams CD? I am prepared to concede it doesn’t necessarily have to be Dar Williams. But it does need to be something that you know so well that it won’t surprise you, it won’t grab your attention away from the work you’re doing. Much as I love Dream Theater, their music is really not the way to go for this. What you want is music that will keep feeding you without distracting you.

Do you really need the red wine and the white chocolate? Perhaps not, but you don’t want this to be a boring, unpleasant process, do you? Treat yourself. (DISCLOSURE: I have moved on to beer.)

What next?

As soon as I’m done posting this, I’ll be going to Step 2, which is to go through the manuscript, making edits on the master copy. Most of them are trivial to do. A few are going to need real work. For these, I just leave a marker in the master copy, “###” and a note saying what needs doing. I will later search for these and do the work. But not tonight.
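
Hunting down those markers later is easy to mechanise. A minimal sketch, assuming the master copy is a file called manuscript.tex (the filename is my invention):

    # List every "###" to-do marker left in the master copy, with line numbers.
    with open("manuscript.tex") as f:
        for num, line in enumerate(f, start=1):
            if "###" in line:
                print(f"line {num}: {line.strip()}")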

The goal of this process is to capture all the information that you wrote on the printed copy, so that you can throw it away and move on with your life.

That’s it — it’s all you need to do. For the record, I expect to submit in the next three or four days.

Trying two new things this morning: grilling a turkey, and live-blogging on SV-POW!

I like to grill. Steak, chicken, kebabs, yams, pineapple, bananas — as long as it’s an edible solid, I’m up for it. But I’ve never grilled a turkey before. Neighbor, colleague, fellow paleontologist and grillmeister Brian Kraatz sent me his recipe, which is also posted on Facebook for the edification of the masses. See Brian’s excellent writeup for the whole process; I’m just going to hit the photogenic parts here. Oh, and usually I tweak any photos I post within an inch of their lives, but I don’t have time for that this morning, so you’re getting as close to a live, unedited feed as I can manage. Stay tuned for updates.

Enough of that. Let’s rock!

The process starts more than a day in advance, with the brine. Salt water, fruit, onions, garlic, spices, and some apple juice.

The turkey needs to be entirely immersed in the brine for at least 24 hours. Doing this in a solid container would require an extra-big container and too much liquid to cover the bird. I follow Brian’s method of brining in a triple layer of trash bags. You can see a turkey roaster peeking out underneath the trash bags. Helps with the carrying.

Put the turkey in the trash bags first, then pour in the brine. Unless you like huge messes.

The genius of the trash bag method on display. You can squeeze out all the air so that the volume of the bag is equal to just the turkey and the brine.

Into the fridge for a day.

First thing this morning: out come the giblets, and save the goodies from the brine. We’ll get back to the neck later.

The bird awaits.

Crucial step: putting in a drip pan. Keeps the coals off to the side for indirect heat, and catches the grease so you don’t burn down the neighborhood.

Putting in the herb butter. I used three short sticks of butter mixed with sage, lemon pepper, and Mrs. Dash. Working the skin away from the meat and then filling the space with butter was extremely nasty. This must be what diverticula feel like.

A chimney is helpful to get the coals going.

To eat is human; to grill is divine.

Smoke bombs: mesquite chips soaked in water, wrapped up in balls of tinfoil, with holes poked on top to let the smoke out.

Fruit and spices into the body cavity.

At this point, I was fairly certain that today would be the greatest day of my life. The turkey is centered over the drip pan, stuffed with goodness, subcutaneously loaded with herb butter, draped with bacon. You can see one of the smoke bombs sitting right on top of the coals.

Know what you’re getting into. This 15 lb bird just barely cleared the lid of my grill.

A little over an hour in. I installed foil heat shields to keep the wings and thighs from cooking too fast. It’s all about the indirect heat. Some of the bacon comes off now, as a mid-morning treat.

Okay, the bird is about halfway done, and I have to whip up some sustainer coals and another batch of smoke bombs. Further updates as and when. Happy Thanksgiving!

UPDATE

I was hoping to get some more pictures posted before we ate, but you know how it is in the kitchen on Thanksgiving Day (or, if you’re not an American, maybe you don’t know, so I’ll tell you: dogs and cats living together, we’re talking total chaos).

The turkey just before I pulled it off the grill. The heat shields turned out to be clutch, I would have completely destroyed the limbs without them. That’s going to be SOP from now on.

Ah yes, the bird, she turned out even more succulent than I hadda expected. Check out the pink shade of the meat just below the skin. I recognize that, from good barbeque, but I’ve never produced it before.

That’s it for the cooking part of today’s program. As for the ultimate fate of the bird… we ate a stupefying amount of it. I sent even more home with our guests. And the other half — yes, half — of this thunder beast is sitting in the fridge. Hello-o leftovers!

And hello-o science!

I was going to post some more pictures of the neck, but I didn’t get around to eating it, so…another time, perhaps. In lieu, here’s Mike’s turkey vertebra in left lateral view (see the original in all its supersized glory here). Note the pneumatic foramen in the lateral wall of the centrum, just behind the cervical rib loop. This is actually kind of a lucky catch; a lot of times with chickens and turkeys, the pneumatic foramina are so far up in the cervical rib loop that they can’t be seen in lateral view.

It used to freak me out a little bit that birds often don’t have their pneumatic foramina in the middle of the lateral wall of the centrum, like sauropods. But a possible explanation occurred to me just this morning as I was planning this post. I think that birds have their pneumatic foramina right where you’d expect them, based on sauropods. I’ll explain why.

The first part of the explanation is that instead of wearing their pneumatic cavities on the outside, like this Giraffatitan cervical, bird vertebrae tend to be inflated from within, with just a few tiny foramina outside. The second part is that birds have HUGE cervical rib loops compared to sauropods. If the sauropod vert shown above had its rib on, the resulting loop would be fairly dainty, the osteological equivalent of a bracelet. The cervical rib loops of birds are more like tubes, they’re so antero-posteriorly elongated.

So take the brachiosaur cervical shown above and shrink all of the external pneumatic spaces by several inches. The cavities on the arch and spine would close up entirely, and the complex of fossae and foramina on the lateral side of the centrum would be reduced to a small hole right behind the cervical rib. Then stretch out the cervical rib loop in the fore-aft direction and voila, you’d have something like a turkey cervical, with a little tiny pneumatic foramen tucked up inside the cervical rib loop.

This doesn’t explain why bird verts are inflated from within instead of being eroded from without, or why sauropods had such dinky cervical rib loops (mechanical what, now?), or why pneumatic diverticula tend to make the biggest holes in the front half of the centrum, adjacent to the cervical ribs. I just think that maybe bird and sauropod pneumaticity are not as different as they appear at first glance. Your thoughts are welcome.