Scientific Journals: Frequently Asked Questions
Scientific journals sit at the center of how knowledge gets validated, shared, and built upon — and the mechanics behind them are surprisingly intricate. These questions cover the fundamentals of how journals work, how articles move through the publication process, how journals are evaluated, and where the system tends to break down. Whether the goal is submitting research, evaluating sources, or simply understanding why a paper is paywalled, the answers here are grounded in how refereed academic publishing actually operates.
What does this actually cover?
Scientific journals are periodicals that publish original research, reviews, and technical communications in specific disciplines — or across disciplines in the case of outlets like Nature or Science. The homepage for this resource covers the full landscape: from how individual articles are structured and evaluated, to how journals themselves are ranked, indexed, and accessed.
The scope runs from mechanics (what happens to a manuscript between submission and publication) to infrastructure (how databases like Scopus or Web of Science decide which journals to index). It also covers the economics — article processing charges in open-access models can run from a few hundred dollars to over $11,000 at journals published by Elsevier or Springer Nature, a range that shapes which researchers can publish where.
What are the most common issues encountered?
Three issues dominate: rejection without review, access barriers, and predatory journals.
Desk rejection — where an editor declines a manuscript before it ever reaches a peer reviewer — is the fate of a substantial share of submissions at high-volume journals. PLOS ONE receives tens of thousands of submissions per year; journals like Cell desk-reject the majority of what arrives.
Access barriers affect readers rather than authors. Most subscription journals sit behind paywalls, and institutional licenses vary enormously. A researcher at a well-funded R1 university has access to a fundamentally different library than a clinician at a community hospital.
Predatory journals represent a third, more corrosive problem: outlets that charge publication fees while performing little or no peer review. These journals corrupt the scientific record by publishing unreviewed claims under the visual grammar of legitimate science.
How does classification work in practice?
Journals are classified along at least 4 primary axes:
- Discipline — subject-specific (e.g., Journal of Neuroscience) vs. multidisciplinary (e.g., PNAS)
- Access model — subscription-based, open access, or hybrid
- Publication frequency — weekly, monthly, continuous (publish-as-accepted)
- Review type — single-blind, double-blind, open peer review, or post-publication review
The types of scientific journals page goes deeper on each category. Within discipline classification, secondary distinctions matter: a letters journal (like Physical Review Letters) prioritizes short, high-impact communications, while a methods journal (like Nature Methods) focuses specifically on technique development rather than findings.
What is typically involved in the process?
The manuscript submission process follows a recognizable sequence, though timelines vary considerably by field and journal:
- Preparation — formatting to journal-specific guidelines, preparing cover letters, selecting suggested reviewers
- Submission — upload through editorial management systems like Editorial Manager or ScholarOne
- Editorial screening — scope check and plagiarism detection (typically 1–5 days)
- Peer review — 2 to 4 external reviewers assess the work; median review time across fields is roughly 40–60 days, though some journals average over 100 days (Publons / Clarivate data)
- Decision — accept, minor revision, major revision, or reject
- Revision and re-review — major revisions typically return to at least one original reviewer
- Production — copyediting, proofing, DOI assignment, and online publication
The DOI and persistent identifiers page explains how articles receive their permanent web addresses at the final stage.
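As a rough illustration of the identifier format that page discusses: a DOI consists of the directory indicator `10.`, a numeric registrant prefix, a slash, and a publisher-assigned suffix. The pattern below is a simplified sketch for sanity-checking strings, not the full Crossref/DOI Foundation syntax.

```python
import re

# Loose DOI shape: "10." + registrant code (4-9 digits) + "/" + suffix.
# Simplified for illustration; the official DOI syntax permits more.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def looks_like_doi(s: str) -> bool:
    """Return True if the string matches the common DOI shape."""
    return bool(DOI_PATTERN.match(s))

print(looks_like_doi("10.1371/journal.pone.0123456"))  # True
print(looks_like_doi("not-a-doi"))                     # False
```

The example DOI above is illustrative, not a real article identifier.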
What are the most common misconceptions?
Impact factor equals quality. The Journal Impact Factor, maintained by Clarivate, measures citations received in a given year to articles published in the previous two years, divided by the number of citable items in those years — it is a citation average, not a measure of scientific rigor. A journal in a citation-dense field will mechanically outscore an equally rigorous journal in a slower-moving discipline. Impact factor and journal metrics covers this in detail.
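The 2-year Journal Impact Factor is a simple ratio, which a sketch makes concrete (the numbers below are invented for illustration, not from a real journal):

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """JIF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the count of citable items from Y-1 and Y-2."""
    return citations / citable_items

# Hypothetical journal: 1,200 citations in 2024 to its 2022-2023 articles,
# which numbered 400 citable items.
print(impact_factor(1200, 400))  # 3.0
```

The "citable items" denominator is itself a judgment call made by Clarivate (research articles and reviews count; editorials typically do not), which is one more reason the number is gameable.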
Peer review catches everything. Peer review is a filter, not a guarantee. Reviewers work unpaid, under time pressure, and in most cases without access to raw data. The retractions and corrections page documents how significantly flawed or fabricated work can persist for years before retraction.
Open access means free to publish. Open access means free to read. Publishing open access often requires authors (or their institutions) to pay article processing charges upfront. The federal open-access mandate in the US requires federally funded research to be publicly accessible, but it doesn't eliminate those charges.
Where can authoritative references be found?
Primary sources for journal-level data include:
- Clarivate Journal Citation Reports — impact factors, quartile rankings by category
- Scimago Journal & Country Rank (SJR) — free, Scopus-based rankings explained at Scimago Journal Rank explained
- DOAJ (Directory of Open Access Journals) — vetted list of legitimate open-access titles
- PubMed / MEDLINE — NLM-maintained index for biomedical literature; inclusion requires meeting journal indexing standards
- Retraction Watch Database — tracks retractions across disciplines
For author-level metrics like the h-index, the h-index and citation metrics page covers how those numbers are calculated and where they mislead.
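The h-index itself has a compact definition worth seeing directly: the largest h such that the author has h papers with at least h citations each. A minimal sketch:

```python
def h_index(citation_counts: list[int]) -> int:
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with 10, 8, 5, 4, and 3 citations: four papers have
# at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The same citation list can produce very different h-indexes depending on which database supplies the counts — Google Scholar, Scopus, and Web of Science index different source sets, a caveat the linked page expands on.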
How do requirements vary by jurisdiction or context?
Publication standards shift significantly depending on funder, institution, and geography.
In the United States, the 2022 OSTP memo directed all federal agencies to eliminate the 12-month embargo on publicly funded research by 2025, effectively expanding the scope of the federal open-access mandate. The NIH public access policy, which predates that memo, already required deposit in PubMed Central within 12 months of publication.
In the European Union, Plan S — coordinated by cOAlition S — requires immediate open access for research funded by participating agencies, with no embargo period permitted. This creates a different compliance landscape than US rules, and researchers funded by multiple sources may face competing requirements simultaneously.
Institutional policies layer on top of funder mandates. Some universities maintain copyright and licensing policies that retain certain rights for the institution, which can affect which journals an author is permitted to publish in at all.
What triggers a formal review or action?
Post-publication scrutiny escalates through recognizable triggers. A single expression of concern from a reader or reviewer rarely produces a retraction on its own — but 3 patterns tend to accelerate formal action:
- Image manipulation detected — tools like ImageTwin or Proofig have made automated screening standard at journals including Molecular Cell and outlets under the COPE (Committee on Publication Ethics) framework
- Data unavailability — when authors cannot produce underlying data upon request, editors may issue corrections or initiate retraction under data availability and reproducibility policies
- Institutional investigation — if an author's home institution opens a research misconduct inquiry, journals are typically notified; most major publishers have procedures for coordinating responses
Research ethics and publication standards outlines the COPE guidelines that most reputable journals follow when handling allegations. COPE's flowcharts — publicly available at publicationethics.org — have become the de facto procedural standard for editorial decisions involving suspected misconduct.