I'd like to extend a huge science librarian blogosphere welcome to Information Culture, the newest blog over at Scientific American Blogs!
This past Sunday evening I got a cryptic DM from a certain Bora Zivkovic letting me know that I should watch the SciAm blog site first thing Monday morning. I was busy that morning but as soon as I got out of my meeting I rushed to Twitter and the Internet and lo and behold!
I'm always happy to see librarians invading faculty and researcher blog networks, and this is no exception.
Even happier news: one of the bloggers at the new site is Bonnie Swoger, long-time blogger at Undergraduate Science Librarian. Bonnie is a super blogger and a terrific colleague who I'm always glad to see at Science Online. I've been wondering what's taken a blogging network so long to snap her up, and I guess it's not surprising that Bora's the one who finally got it done.
Joining Bonnie is an equally wonderful but new-to-me blogger, Hadas Shema. Hadas is an Info Sci grad student at Bar-Ilan University in Israel and formerly blogged at Science blogging in theory and practice.
Here's what Bora has to say in his Introductory post: Welcome Information Culture - the newest blog at #SciAmBlogs
How to do an efficient search? How can a librarian help you find obscure references? What is this "Open Access" thing all about? Why is there a gender gap among Wikipedia editors? How do science bloggers link to each other? Can tweeting a link to a paper predict its future citations? How to track down an un-linked paper mentioned in a media article? What is going on with eTextbooks?
And from the new blog itself, a taste of the first three posts:
Introduction post - Hadas Shema
Two questions I get asked now and then are A. "What do you study?" and B. "What is it good for?" (as in "Why should my tax money fund you?"). Now that I have an excellent platform like this SciAm blog, I might as well take advantage of it to answer at least the first question (I'll let you decide if it's worth the taxpayer's money).
I study Information or Library Science, and my sub-field is what used to be called Bibliometrics, "the application of mathematical and statistical methods to books and other media of communication" (Pritchard, 1969). The term was invented back in '69, when official scientific communication involved dead trees. The Russian version, "Scientometrics," was coined around that time as well. Today we have a variety of other terms, perhaps more appropriate for the net age: Cybermetrics, Informetrics, Webometrics and even Altmetrics. But for now, let's stick with Bibliometrics.
Bibliometricians measure, analyze and record scientific discourse. We want to learn what impact scientific articles, journals, and even individual scientists have on the world. Until recently "the world" meant "other articles, journals and individual scientists," because it was next to impossible to research the way scientific discourse affects the rest of the world, or even how scientists affect it when they're not acting in an "official" capacity (publishing a paper or speaking at a conference). Now Bibliometricians not only need a new name, but new indices. That's what I (and plenty of other people) work on. We ask what scientists are doing on the Web, how and why they're doing it and, most important, whether we can use it to evaluate the impact of their work.
You have to share (by Bonnie Swoger)
Understanding how scientists share their results is my job. I am a science librarian.
I work with scientists at my college to make sure that they have access to the information they need to do their work. I teach undergraduates - novice scientists - how the scientific literature works: What kinds of information are available? Where can you find what you need? How can you use the different types of information? I work with researchers to help them understand new developments in scholarly communication: What is a DOI and how can it make your research just a bit easier? Are you allowed to post a copy of your recent article on your website and what are the advantages if you do?
And as I work with students and faculty at my institution, this blog will be a place for me to share some of these concepts with you. I'll share tips to help you find information faster, explain basic concepts related to the publication of scientific results and try to figure out how recent scholarly communication news ...
*snip*
It's hard to stand on the shoulders of giants if the giants are hiding under the bed.
Understanding the Journal Impact Factor - Part One (by Hadas Shema)
The journals in which scientists publish can make or break their careers. A scientist must publish in "leading" journals with a high Journal Impact Factor (JIF) (you can see it presented proudly on high-impact journals' websites). The JIF has become popular partly because it gives an "objective" measure of a journal's quality and partly because it's a neat little number which is relatively easy to understand. It's widely used by academic librarians, authors, readers and promotion committees.
Raw citation counts emerged in the 1920s and were used mainly by science librarians who wanted to save money and shelf space by discovering which journals made the best investment in each field. The method had modest success, but it didn't gain much momentum until the sixties. That could be because said librarians had to count citations by hand.
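A quick aside from me, since Hadas will presumably get into the details in Part Two: the "neat little number" really is just a ratio. The standard two-year calculation for a given year Y looks like this, with the example figures below made up purely for illustration:

\mathrm{JIF}_Y = \frac{\text{citations received in year } Y \text{ by items published in years } Y-1 \text{ and } Y-2}{\text{number of citable items published in years } Y-1 \text{ and } Y-2}

\text{e.g. } \mathrm{JIF}_{2011} = \frac{500 \text{ citations in 2011 to 2009--2010 papers}}{200 \text{ citable items from 2009--2010}} = 2.5

(The denominator counts "citable items," which in practice usually means research articles and reviews.)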
Run on over and say Hi to Bonnie and Hadas!
Thanks, John. Hopefully we'll keep it interesting :)