During the summer, we are keeping summer hours. Posts will be shared every other week, and breaks will be taken for the 4th of July, the last two weeks of August, and Labor Day!
Too often, the beauty industry uses pseudoscience to promote its products. Even for a trained scientist, it's difficult to tell the difference between fact, embellishment, and downright fiction.
Further complicating matters, cosmetics do not need FDA approval. Cosmetic regulation by the FDA is minimal, and the laws governing its practice have not changed since 1938. The FDA's power over the cosmetic industry is limited to removing products from the market if they are "adulterated" or "misbranded." In other words, a product can be nixed if it contains a poisonous or spoiled ingredient or if the label provides misleading information.
With the lack of cosmetic regulations, consumers have to ask, "Does this product even work?" Can you trust that charcoal will cleanse your blackheads, that hyaluronic acid will plump your under-eyes, or that collagen will increase the elasticity of your skin?
Today's article will be the beginning of a series of posts on the science of common cosmetic additives. To begin with, we will explore the biology and chemistry of one of skin care's most prevalent ingredients, collagen.
What is Collagen?
Collagen is a natural biomolecule produced by animals, acting as structural support for cells within our connective tissues such as skin, bone, tendons, and cartilage. There are 28 subtypes of collagen, all with similar function.
Collagen is a protein, meaning it's made up of a chain of amino acids folded into a unique shape. Specifically, collagen is made of three amino acid chains twisted together to form a long, stretchy, helical protein:
Collagen in skincare
Many biological mechanisms contribute to skin aging, including a decreased production and a deterioration of collagen networks. Intuitively, collagen supplements may alleviate wrinkles and loss of elasticity. But is there scientific evidence to support that collagen is an effective anti-aging additive? Or is an untested theory the driving force of collagen sales?
Quite a few studies support that hydrolyzed collagen has anti-aging effects when used topically. For example, in a 2019 study, participants given topical collagen had increased skin hydration and elasticity in just 28 days, while wrinkles improved after 90 days of treatment. A more recent study also reports topical collagen as an effective cosmetic. Participants who applied a gel containing 1% collagen hydrosolate extracted from chicken stomachs exhibited increased skin hydration and elasticity and decreased wrinkles and roughness. Thus, the collagen flooding the skincare market is, indeed, backed by science!
Side bar: I was inspired to write this blog post when I saw an advertisement for drinkable collagen. I rolled my eyes, convinced it was a ridiculous Instagram trend, similar to skinny teas.
As it turns out, there is validity to ingesting collagen, and quite a few peer-reviewed journal articles back this practice. A study conducted in 2015 reports that middle-aged women who consumed 10 g of collagen per day exhibited increased skin moisture and collagen density. Another study displayed similar results, showing that low-molecular-weight collagen peptide oral supplements increased skin moisture and reduced wrinkles.
Although collagen is synonymous with skincare, collagen plays other pivotal roles in our bodies. In fact, collagen is our most abundant protein. Ongoing research suggests collagen supplementation may assist with bone regeneration, wound healing, and arthritis treatment.
This post is by no means a comprehensive literature review of collagen in skincare. Many additional studies agree with the publications mentioned here. So the next time you reach for a skincare product with collagen, have faith that collagen products aren't so pseudoscientific after all.
Bolded Science is on vacation for the month of May. In the meantime, check out some of our older posts.
Considering writing for Bolded Science?
Are you looking to add more writing samples to your sci-comm portfolio? Do you have a science-related cause you'd like to advocate for? Are you conducting super interesting research that you are just dying to tell everyone about? Pitch a blog post to Bolded Science: TheBoldedScientist@gmail.com. It's simple: send an email introducing yourself, along with 1-2 sentences explaining what you'd like to write about. Then, we will pick a publish date. Once you are approved, write your post and email it two weeks before your publish date.
Questions? Please see the FAQ page: https://www.boldedscience.com/faq.html. Question not there? Please send an email and I'd be happy to help.
Thank you to all readers and writers of Bolded Science. This collaborative blog is made possible by our curious, knowledgeable, and passionate community of scientists. In just over a year, Bolded Science has grown to more than 4,000 Twitter followers, thousands of blog views, and over 60 blog posts.
Accepting blog posts for the following publication dates:
(First draft is due two weeks before publish date)
All the best,
Kerry Silva McPherson
Creator and Editor of Bolded Science
PS: Please be patient with email replies; I'll only be checking my inbox a couple of days a week.
What do you do when you no longer need an item? After unwrapping a Mars bar, what happens to the packaging? How much thought do we put into what we throw away? How much thought do we put into waste overall?
Waste is a big problem. Well, actually, HUGE. Yes, huge is the right word. National Geographic reports that an estimated 8 million tons of plastic end up in the oceans annually. Furthermore, researchers project that by 2050 the volume of mismanaged plastic waste will climb to 155-256 million metric tons per year. And we are not only talking about plastic waste today, so the numbers are even higher!
Where could we possibly hoard all this waste? How can we possibly make all this waste? And how can we get rid of it?
"And then there was waste" - the origins
Definitions for waste:
(a) the useless remains of human activity
(b) materials that are useless to the producer
(c) materials that we are willing to pay to dispose of
In no uncertain terms, waste is the stuff we no longer need and are happy to discard. So, how do we handle waste? Well, with categories. Since waste is a megaton problem, knowing its sources and properties can help us understand and deal with it better.
Categories of waste depend on the source; the two overarching categories are (a) municipal waste and (b) industrial waste. Municipal waste is produced in our private lives: our homes, areas of recreation, and everyday activities. Industrial waste, on the other hand, refers to discarded, emitted, and leftover materials from industrial activities. The division is not always clear. For example, hospitals and restaurants are workplaces for staff but places of health services and recreation for patients and customers, respectively.
We, as consumers of goods, have control over the waste we produce. We, as industrial teams, can control industrially produced waste. We, as the human race, are responsible for 100% of the waste deposited on the planet. The difference between human waste and animal waste is that ours cannot be absorbed by nature, because of both its volume and its composition.
Industrial waste substantially outweighs municipal waste. A 2011 waste report for the UK (population around 65 million) accounted for 27.3 million tons of municipal waste and 41.1 million tons of industrial waste.
The story these numbers tell us is that of every 100 tons of waste the UK produced in 2011, roughly 40 tons came from our private lives and 60 tons came from our business lives.
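The 40/60 split is simple arithmetic on the two reported figures; a minimal Python sketch (using only the 27.3 and 41.1 million-ton numbers from the UK report above) makes the check explicit:

```python
# UK 2011 waste figures (million tons), as reported above.
municipal_mt = 27.3   # waste from our private lives
industrial_mt = 41.1  # waste from industrial activities

total_mt = municipal_mt + industrial_mt
municipal_share = municipal_mt / total_mt    # fraction of total that is municipal
industrial_share = industrial_mt / total_mt  # fraction of total that is industrial

print(f"Total: {total_mt:.1f} million tons")
print(f"Municipal: {municipal_share:.0%}  Industrial: {industrial_share:.0%}")
```

Running this prints a total of 68.4 million tons, with the municipal share rounding to 40% and the industrial share to 60%.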
But recycling — you say?
I hear you. Let's take a quick look. Many countries have recycling schemes, some better than others. However, we do not recycle fast enough, we do not manage waste well enough, and recycling technologies are still in their infancy.
More effective recycling and management can bring numbers down. And the question remains, where does the waste go? Where do we deposit it?
A game of hide and seek – depositing waste
Deposit sites differ by geography, management policy, and law. However, all countries have landfills to bury waste in the ground. In fact, landfilling is the most common way of putting waste out of sight.
Some countries even allow depositing waste directly into bodies of water. This practice is not necessarily by design. Rather, the lack of legislation making dumping in water illegal, and the lack of enforcement of existing legislation, enable large-scale water pollution. Even where it is illegal, companies, industries, and businesses still do it. Citizens still do it. Remember when you visited the beach with bags full of snacks? Did you remember to pick up your waste before leaving?
Another "eco-friendly" way to manage waste is incineration: we burn it. Incineration is not a new practice, and it involves a lot of chemistry and social impact. It is now considered eco-friendly because it is used to produce energy. So we no longer just burn waste; we burn it with a purpose. On the flip side, the by-products of incineration may include toxic substances and heavy metals. Ash and pulp are the end by-products of incineration, and the pulp is likely to end up in a landfill.
Take-away message: the Earth has afforded us the grace of not thinking about our waste for a few hundred years now, but we can no longer be as thoughtless. The numbers are daunting. When waste goes out of sight, it doesn't go out of existence. Landfills leak into the ground and poison the soil. Ocean life is obviously in danger from pollution, and the air is fast becoming a concern – think of citizens in industrial cities who wear masks all year round due to poor quality air.
The lesson of our current moment is that traditional ways of managing waste have gross negative effects. Despite the research, innovation, and promising new practices, the negatives still outweigh the positives. More thoughtfulness about the contents of our waste bins is necessary, because awareness helps initiate action.
Are scientists born or nurtured?
Considering that men predominantly hold STEM careers, arguments have been made that women aren’t inherently suited for science. Regardless of where you stand on this topic, this sex gap can partially be attributed to pervasive gender stereotypes in the sciences, specifically around what it means to be a “scientist.”
I attribute my early life relationship with science to gender bias. Growing up, I remember loving science programs and toys, but I never considered it for a career. In fact, I avoided participating in science activities throughout school for fear of failure. And the more aware I became of what it meant to be a “scientist,” the more alienated I felt.
Contrastingly, writing, a more decidedly “feminine” pursuit, was where I felt at ease. And consequently, I focused my attention on the humanities as a student. This led me to pursue an undergraduate degree in media, with a dream of doing music industry research in the UK. I still flirted with science, though, by adding a second major, psychology, to my workload. But psychology was my “fun” major. Namely, I was interested in learning about people, disorders, and the brain, but in the “safe” environment of having it be my “second major.”
Nonetheless, despite my “fear” of science, I graduated with a 4.0 GPA in both media and psychology at Penn State, which got me into a great communications graduate program abroad. I loved living abroad; I’ve considered doing it again. But despite it being a positive experience, I was unfulfilled with the research that I was doing. I longed to do research that positively impacted people, much like my dad, who had a PhD in pharmaceutical chemistry. My dad had worked as both a senior industry scientist developing pharmaceuticals and as a pharmacist. I wanted to benefit others the way he had.
At this crossroads, I changed my research direction by applying to PhD programs in health communications in the US. My plan was to relate my background in media and psychology to public health. It seemed like an obvious transition. But despite interviews with Cornell and Michigan State, I was rejected from every school where I applied. Additionally, on the day that I received my final rejection letter, I witnessed my dad dying of an unexpected heart attack while we were shoveling snow. To clarify, my dad wasn’t aware of my rejection letters when he passed and died believing that I was following in his footsteps. Looking back on this, I’m not sure how I could continue functioning as a productive adult. But somehow, the experience helped me thrive.
With nothing to lose, I decided to pursue a Master of Science in psychology. I had always been fascinated by the brain, despite shying away from science. Over the years, I tentatively developed the desire to study the neurobiology of eating disorders. My interest in eating disorders stemmed from being a former figure skater. I was interested in how some individuals developed eating problems while others didn’t. Additionally, during my undergraduate media studies, I became frustrated with the argument that eating disorders resulted from a culture of thinness. While this is partially true, I wanted to understand these disorders from a biological perspective and enlighten the public about their complexity. This career ambition wasn’t something that I readily shared, though. How could I study neurobiology without a science background?
But I was willing to try.
Taking neuroscience courses as a graduate student has been very challenging and, at times, embarrassing. For example, after getting a low C on my first neuroscience exam, I set up a meeting to talk with faculty about switching my degree to counseling. My grade seemed to confirm that I wasn’t a scientist. But I thought about my dad and how proud he would be of me, and I reasoned that all I could do was try. Plus, I had earned a tuition-free scholarship, and I couldn’t give that up.
So, that first semester, I attended office hours with my neuroscience professor every week, sometimes for two hours at a time, and I got an A in the course. In my second semester, I joined one of the few biology-based labs in the program, a pharmacology lab that used serotonin and dopamine deficient mice to study behavior. This experience was symbolic for me, as my dad had worked in pharmaceuticals. It was rough, though. I had zero lab experience, no chemistry knowledge, and had never handled rodents. I fumbled over measurement conversions (I still do) and other basics that the undergrads did easily. Nonetheless, with a lot of humility and perseverance, I completed my thesis on the brain and behavioral effects of fluoxetine on binge eating in serotonin deficient mice in two years.
After earning an MS in psychology, am I a scientist? I didn’t feel like one after going through another cycle of PhD rejections, this time for neuroscience. I was tired. I’d worked too hard to give up, though, and focused on bolstering my CV. I volunteered in an eating disorders lab at Temple and proactively contacted potential advisors. This third round of applications stuck, and I am currently pursuing a PhD in neuroscience at Purdue, focusing on the neurobiology of eating disorders.
As a PhD candidate in a STEM field, am I a scientist? Occasionally I still have significant moments of imposter syndrome. Although, my perception of a scientist is changing.
I’ve realized that science is a messy, mistake-laden endeavor. The expectation is that we follow perfectly laid-out steps and complicated calculations, but in reality we try, fumble, and try again. We improvise and problem-solve and often have no idea what we’re doing. Even more importantly, I’ve realized that the atmosphere of exclusivity and masculinity that enshrouds the scientific community is unnecessary. We need to communicate these misconceptions about what it means to be a scientist to the public to increase accessibility and diversity in science. By shattering the stereotype of the scientist, we have the potential to broaden contributions to the scientific enterprise while increasing the trust of science skeptics.
My advice for those who enjoy science but have avoided it because it “wasn’t their place”:
Looking towards the future, finding a career that highlights my diverse skillset is challenging. However, carving out new spaces for scientific enterprise is one way to update what it means to be a scientist. I hope that you, too, will be inspired to follow your curiosity down new paths, undeterred by failure but emboldened by discovery.
A post in collaboration with The Shared Microscope
Caution: Smoking tobacco is harmful to health and, as most of you know, is carcinogenic. This article in no way encourages the use of tobacco as protection against COVID-19.
In response to the COVID-19 pandemic, several drug companies, including AstraZeneca, Moderna, Pfizer, Johnson & Johnson, Sinovac, and Novavax, have successfully created vaccines that are currently being deployed to individuals worldwide. Medicago and GSK have also joined forces to develop a unique vaccine against COVID-19.
Medicago, a biopharmaceutical company headquartered in Canada, focuses on developing plant-based therapeutics in response to global health challenges, COVID-19 being no exception. In response to the pandemic, Medicago (alongside GSK) developed a COVID-19 vaccine unlike any of the others recently developed. So what's special about this vaccine? Interestingly, it is the first COVID-19 vaccine that is entirely plant-based. Yep, you heard that right — made in plants! Tobacco plants (ish)!
What plant is the Medicago COVID-19 vaccine made in?
Medicago's vaccine is made in a close relative of the tobacco plant, Nicotiana benthamiana. Ironically, tobacco has plagued humans with lung conditions ranging from COPD to lung cancer for decades. Yet over the past year, this notorious plant family has been critical in manufacturing a vaccine against COVID-19 and the pneumonia it causes.
Plant-based platforms are cheap, safe, and fast at producing vaccine ingredients such as coronavirus-like particles. They are also highly scalable and give manufacturers increased flexibility. As highlighted by the peer-reviewed paper linked here, Nicotiana benthamiana is the core production host for various companies aiming to further human medicine, including Medicago, Icon Genetics, iBio, and Leaf Expression Systems.
Nicotiana benthamiana is often the production host of choice because it has a defective immune system, making it easy to infect and engineer. Research suggests that the plant sacrificed its defenses in favor of a hastened reproductive cycle, a strategy that enabled it to cope with severe drought in central Australia. This susceptibility to infection allows the plant to quickly undergo genetic transformation and transient gene expression, making it an excellent mini-factory for protein production.
How are plants used in vaccine development?
For over a decade, Medicago has been developing a technology that harnesses the power of plants to develop vaccines. More specifically, using a bacterium, the plant is "fed" information about a virus to produce the main ingredient of a vaccine (also known as the active ingredient).
The plant is programmed to produce the vaccine's active ingredient. To do this, Medicago needs only the virus's genetic sequence rather than the virus itself. The relevant genes are introduced into the plant via a bacterium, and the plant's normal machinery then produces the vaccine ingredient through a natural process within the tobacco plant. The plant can produce the active ingredient in approximately one week from initial exposure to the genetic material. The active ingredient is then harvested and purified for use in a vaccine candidate, which can be produced within five to six weeks.
For the coronavirus vaccine in development, Medicago focuses on producing Coronavirus-Like Particles that mimic the virus's structure. Please note: although the Coronavirus-Like Particles are structurally identical to the SARS-CoV-2 virus (which causes COVID-19), they do not contain any of the genetic material of the virus, and therefore, are unable to cause infection.
To learn more about the COVID-19 vaccine in development by Medicago, feel free to check out the following video:
However, the production of the vaccine candidate is only half the battle won: the vaccine candidate then has to pass various safety and efficacy tests before the vaccine can be commercialized for use in humans. This testing process is further explained in an article linked here.
Tell me more about the Coronavirus-Like Particle Technology
Virus-like particles, such as the coronavirus-like particles manufactured by Medicago, are structurally identical to wild-type viruses. However, they lack the genetic material inside the virus. Because of this, the virus-like particles are unable to replicate or cause infection in the vaccinated individual.
Below is an image of Medicago’s CoVLP compared to an image of a wild-type SARS-CoV-2 virus:
It can be argued that vaccines produced in plants are faster and more accurate because no manipulation of the virus is required. The vaccine development also does not require that viruses be handled in the laboratory. Research so far suggests that virus-like particles have had an equivalent or superior immune response in mice when compared with live viruses.
Virus-like particles are structurally stable and cannot develop mutations, unlike the viruses used in traditional vaccine manufacturing, which sometimes can. The virus-like particles elicit an immune response via antigen-presenting cells (a type of white blood cell) in the human body, leading to a robust immune response in the vaccinated individual.
In short, virus-like particles fool the immune system: because they look like the disease-causing virus, they trick the body into making antibodies that will protect against the actual infectious pathogen upon natural exposure.
Isn't Medicago working with GSK for this vaccine?
Yes, Medicago has partnered with GSK for the development of its COVID-19 vaccine. As part of the partnership, Medicago manufactures the active ingredient, and GSK provides its pandemic vaccine adjuvant system.
What is the importance of the adjuvant? Adjuvants play a vital role in pandemics, as they did during the last flu pandemic in 2009. A pandemic adjuvant reduces the amount of vaccine protein required per dose, allowing more vaccine doses to be produced overall. In other words, the Medicago vaccine can be "diluted" thanks to the adjuvant, helping to protect more people, while the adjuvant itself enhances the immune response and provides longer-lasting immunity against infection.
Is the Medicago vaccine vegan?
This seems like an apt question here. Is the Medicago COVID-19 vaccine really vegan? Medicago has not directly responded to this question (yet). But one thing is clear: the active ingredient of the vaccine, the coronavirus-like particle, is not of animal origin. More information about the other ingredients in the vaccine is needed before we can say for certain whether or not it is vegan.
What's the current situation?
The Canadian company, alongside GSK and Philip Morris, recently reported promising results from its Phase I and II clinical trials. The vaccine is now in the final phase of human trials. To learn more about the vaccine development process, feel free to check out this article here.
Pending the results of the Phase III trials, the company has reached an agreement with the Canadian government to accelerate its COVID-19 vaccine efforts: the government has agreed to a supply of 76 million doses of the COVID-19 vaccine within Canada.
Carpe Fiscus! The time is ripe to stimulate the "pathway to independence" for early career researchers.
In August 2017, faced with the increasing probability of drastic budget cuts under the Trump administration, the NSF Directorate of Biological Sciences announced it would no longer be funding its long-praised Doctoral Dissertation Improvement Grants (DDIGs). This decision marked a turning point in funding opportunities for graduate students, who are particularly vulnerable to a lack of grant options. As a graduate student at the time, I wrote about the decision and its ramifications for myself and my peers, citing the NSF's choice to slash the program as a brand of "trickle-down" academics. I noted that the decision to cut DDIGs was worrisome for early-career researchers, heralding a further consolidation of academic power at the very top levels of the hierarchy and diminishing agency for already-vulnerable trainees.
Just under four years later, the world and I have both moved on – I passed my dissertation defense and began a new chapter in my academic training. The US handed the reins of power over to a markedly different administration, one constantly challenged by the lingering watermarks of its inherently anti-science predecessors. With this country-wide transition and the eyes of the world increasingly on academic research during a global pandemic, we sit at the cusp of an incredible opportunity to push funding opportunities for early-career researchers further than ever before.
Funding for Early Career Researchers
Most graduate students in STEM are funded through a combination of research and teaching assistantships, the money for which comes from grant and institutional funding, respectively. Grant funds are often awarded directly to the student's principal investigator or PI — the person directly responsible for mentoring and supervising graduate students. Teaching assistantships are often seen as less desirable by the students and their PIs, as teaching takes time away from research. Therefore, research assistantships are prized but depend entirely on the PI for funding and tend to be fairly restricted in subject matter. The PI faces pressure to publish and present on the experiments laid out in the original proposal as this evidence of success is instrumental in future funding decisions; there is usually little room for creativity or independence from the grad student.
Funding opportunities for postdoctoral fellows are often similarly awarded to the PI rather than the fellows themselves, with fairly rare exceptions. The NSF has a single program aimed solely at biology postdoctoral fellows called the postdoctoral research fellowship in biology (or the PRFB), while the NIH, the other major funder of foundational research in the US, has several opportunities geared directly at postdocs, including their F32, K25, and K99/R00 awards. Competition for these awards is fierce. It is much more common for postdocs to be funded as a component of grants awarded to established PIs, leaving little opportunity for postdocs to control the direction of their research. Instead, they remain bound to research that can directly tie into the goals of the grant they are funded under – goals which they may have had no hand in setting.
Rightly or wrongly (and it would be the topic of a whole other post to unpack whether it's right or wrong), we predominantly train our graduate students and postdocs to be future PIs. PIs must be able to independently find funding opportunities, generate innovative research proposals, and follow through on those proposals' aims. These steps all involve thinking creatively, responding to unpredictable events, and adjusting accordingly. While funding decisions are based strongly on publication record, they also depend heavily on a proven track record of securing independent funding. Depriving our early-career trainees of opportunities to establish a funding track record makes their professional and academic journeys much harder. It takes away their agency, leaving them more reliant on their PI.
Hopefully, if you've read this far and are still invested in reading further, I've convinced you that funding opportunities for early-career researchers should be expanded. What can we all do to make sure that this expansion happens? Well, the answer will likely depend on who you are and your specific role within publicly-funded research.
For fellow academics, especially those further along in their careers:
While most people are aware of the threats posed by climate change, few know how drastic those threats are to biodiversity. According to the World Wildlife Fund (WWF), the Earth loses roughly 10,000 species every year, a rate roughly 5,000 times the natural extinction rate.
While zoos are an effective way to house endangered or threatened species, the reproductive biology of these animals is largely unexplored but is becoming increasingly important for species conservation. Two of the most pressing issues facing zoos today are space and lack of genetic diversity. Even when zoos are well-managed and internationally connected, they rarely contain enough animals for long-term population sustainability. Moreover, when new animals are brought in to revitalise captive population genetics, the logistics of moving animals between zoos can be extremely challenging (imagine the logistics and costs of moving an elephant or giraffe from New Zealand to New York). This is where assisted reproduction can play a significant role.
What is assisted reproduction?
Broadly speaking, assisted reproduction involves managing an animal's reproductive cycle or manipulating gametes to achieve fertilisation and a subsequent pregnancy/live birth. Some of the most common assisted reproductive techniques in our arsenal are gamete cryopreservation, artificial insemination (AI), and in-vitro fertilisation (IVF). Assisted reproductive techniques have become so well refined in humans that, since the birth of the world's first IVF baby in 1978, around 8 million children have been born from assisted reproductive techniques globally. Assisted reproductive techniques have also become so commonplace in laboratory rodents and farm species that we often forget the incredible difficulty of defining the fundamentals of a novel species' reproductive biology. Unfortunately, this is exactly the case with many endangered or threatened species. Even artificial insemination, one of the more basic assisted reproductive techniques, requires an in-depth understanding of male and female reproductive physiology before we can even think of making an attempt. Although daunting, once even simple techniques like AI or reproductive cycle management are defined, assisted reproductive techniques can be incredibly useful in supporting captive breeding efforts.
As I mentioned earlier, transporting some animals between zoos (let alone continents) is extremely challenging. Sperm cryopreservation offers an effective alternative for many species: semen is collected, either voluntarily or through electroejaculation, and frozen without dramatically affecting sperm viability. Similarly, even cells from wild individuals can be collected, frozen, and used in captive breeding programs. Cells frozen correctly can (in theory) remain viable forever and be shipped around the world far more cheaply and simply than an entire animal. Several institutes around Australia, including the Taronga Conservation Society and Monash University, have adopted this idea and established the futuristic concept of a 'frozen zoo.' Frozen zoos store cells from endangered animals and plants in liquid nitrogen until they're needed for future genetic reintroduction programs into captive or wild populations through techniques such as artificial insemination or IVF.
I think it needs to be clearly stated that assisted reproductive techniques are not intended to (and, I think, never will) replace captive breeding. They are tools that scientists, conservationists, and zoo staff can use to more effectively increase captive animal numbers, complementing traditional breeding methods rather than replacing them.
Have frozen zoos and assisted reproductive techniques been useful before?
In practice, assisted reproductive techniques are rarely used in captive settings due to their technical complexity and perceived costs. However, assisted reproduction continues to make headlines in scientific literature and the media, including artificial insemination in giant pandas and jaguars, cryopreservation in coral and fish species, and, most recently, the cloning of black-footed ferrets from cells frozen over 30 years ago. While it may seem drastic to start cloning rhinos or freezing sperm from lions, climate change poses incredible threats to species biodiversity, which we are doing a terrible job of mitigating. The Earth is losing roughly 10 million hectares of forest every year, and, as a result, animal populations are becoming increasingly fragmented and isolated, limiting gene flow between populations. Without enough genetic diversity, a species can suffer from inbreeding depression: reduced biological 'fitness' and a diminished ability to reproduce and survive in the wild. Reliable techniques for preserving and transporting genetic material between captive settings (or from the wild into captivity) enable better management of genetic diversity and, with it, a species' biological fitness.
So, what does the future of assisted reproduction look like?
While assisted reproductive techniques have clear immediate and future benefits to species conservation, their use is unfortunately not up to the conservationists and scientists but to funding bodies and political bigwigs.
The importance of assisted reproductive techniques in the future of species conservation cannot be overstated, and researchers continue to build the case for assisted reproductive techniques as reliable, effective tools for the protection of biodiversity. Conservationists and assisted reproductive biologists have chosen a difficult career, often restricted by funding issues and a pervasive misunderstanding of the importance of biodiversity among the general population. Although everybody loves trailblazing, revolutionary discoveries and achievements in science, these are only possible after decades of fundamental research. Without proper funding or public interest in biodiversity, species conservation will remain an incredibly tough, arduous field. That being said, although progress may seem slow, if we continue to fight the uphill battle against climate change, we will be glad we invested in assisted reproduction science when we had the chance.
How long do you think is an appropriate time for students to commit to a PhD? If you ask around, the perceived range varies quite a bit: 3-4 years, 4-6 years, or even double-digit years. If we can't agree on the length of a doctoral degree (the way we can for med, pharmacy, and law school), there must be other cemented parameters that guide students to graduation, right? .... Right....?
How do you know when you are ready to graduate?
Most STEM doctoral students travel a similar path: they conduct research until their project is complete, then, after writing and defending a thesis, they are conferred the title of "Doctor." The question is, when does one actually "complete" a research project? Research is never done. One question is answered, which leads to another question, and another, and another.
So how does one judge whether a graduate student's work is finished? I tweeted out this question some time ago and received various answers. The most common responses were number of publications, completion of proposal aims, or years in the program. A few less common answers were "vibe," "loss of funding," "up to the PI," and....
So let's break the most common answers down:
Number of publications: Many programs require 1-2 publications for a student to graduate. Sometimes this is fair; other times, it puts the student at a disadvantage. This parameter neglects to normalize for the support system within a lab. Some students have no lab techs, post-docs, or collaborators, meaning that publishing is a far more arduous task than it is for their counterparts. A lucky student receives a project that is low-risk or partially finished, while others work on high-risk projects for years without a payout.
Completing proposal aims: On the surface, this seems an equitable stipulation for graduation. But projects frequently evolve and take the student in a different direction than their proposal, rendering this parameter unworkable. If a student veers off path for the sake of scientific exploration, should they still be held to a proposal they wrote years ago?
Years in program: Not all projects are created equal, and not all students put in the same effort over the same period of time. However, I argue that a time cap drives productivity, encourages streamlined research, and motivates PIs to support their students in finishing their projects.
Graduate student labor.
Graduate students are cheap labor, most making about $30,000 a year. PIs are reluctant to let students graduate and thereby forfeit a valuable resource. Beyond their thesis project, a grad student is also expected to train incoming lab members, maintain lab equipment, contribute to lab chores, and work on side projects.
We have to ask ourselves if more years in the same program with the same mentor is beneficial to a student's education and training. Is a seventh-year student still learning from their mentor, or are they underpaid employees receiving typical on-the-job training? Furthermore, extending a student's PhD training can certainly have setbacks.
Time at school should not be taken for granted.
A long PhD can have detrimental effects on a student's life and career.
From the many student-PI conflicts I've seen, it's naive to believe the current system of arbitrary graduation guidelines is working. To give more protection to graduate students, I propose two policies: (1) A 4-year time cap for students with a Master's degree or a 5-year time cap without a Master's degree, and (2) salary raises for students throughout the program.
Providing a time cap.
With no time cap, graduate students are encouraged to work on non-thesis research and to tackle high-risk projects that are likely not to pan out. A time cap can motivate both the student and the mentor to come up with a practical project plan and remain focused on the thesis work needed for graduation. Consider Parkinson's Law, which states that work expands to fill the time allotted. Under this principle, a student will finish their dissertation research in four years if given a four-year limit. Without one, a student will linger in the program until an external factor (funding, unhappiness, or a job offer) pushes them to wrap up their project.
However, there are plenty of reasons to fight against a time cap: variability between disciplines, discrepancies in work ethic, and the neurodiversity of students. This is why I recommend a 4-5 year time cap with an opportunity to extend. Extensions can be offered to students with disabilities, or if the student, PI, and thesis committee mutually agree that staying in the program is beneficial.
Demanding raises for long-standing students
Paying senior students more is another way to ensure equity for graduate students. Senior students take on more responsibility and are [usually] more skilled. Granting raises rewards students for their time in the program and removes the cheap-labor incentive to keep them around.
It's not all the PI's fault.....
It's easy to blame the thesis advisor. But if PIs aren't given the proper tools and support systems to keep a lab running smoothly, can we blame them for wanting to keep students on longer? Institutions need to shoulder more of the burden of supporting PIs.
I am aware that many European universities have time limits. This post is written by an American graduate student whose program has a loose 7-year limit. I am a fourth-year student who intends to graduate during my fifth year. Even with three publications (one published, one submitted, and one underway) and a Master's degree, I receive pushback that I should stay well into my sixth year. I'm convinced that more time in my PhD will not further my education or career prospects, but I am certain it will affect my financial and mental health.
We’re a year into this pandemic, and although the numbers seem to be improving, video-chatting is here to stay. Even once we reach “normal,” the convenience and flexibility of virtual meetings likely means we have plenty of web-based interactions in our future.
So, when it’s time for you to plan your next zoom event, here are a few things to consider:
1. Schedule time for IT issues. Plan for connectivity issues and microphone checks for all speakers. And if using multiple speakers and breakout rooms, plan for some adjustment time. Don’t expect every handoff between speakers to happen smoothly and instantly.
2. Schedule breaks. Just because we’re sitting at our laptops doesn’t mean we don’t need coffee, lunch, and bathroom breaks. During long meetings, your participants’ brains need a rest from dense content.
3. In-person format ≠ zoom format. I’ve participated in a few events where the organizer took the schedule from previous in-person events and reused it over webchat without modification. Without in-person socialization, audience attention spans shrink. So, for web-based events, less is more: consider shortening your format. We don’t want to stare at our computer screens for a full day!
4. Ask questions to improve engagement. You may have noticed there are fewer questions and less participation during webchat events. Use the poll functions, and don’t be afraid to run fun quiz questions with your audience.
5. Take advantage of breakout rooms (when appropriate). Breakout rooms can be great for facilitating conversation, e.g., for panel discussions or working through discussion questions. But gauge your audience: will breaking into smaller rooms spark more conversation, or will it dilute the pool of participants likely to actively participate?
6. Say good-bye to weekends. Conferences are often held on weekends because of the availability of parking, hotels, and conference rooms. But with the internet, availability is endless. Please, organizers, leave my weekends alone.
Although we complain about zoom life, I love it! I can easily meet and talk with people around the world. Covid-19 brought about terrible hardships, but at least it acclimated us all to the lovely world of video-chatting.