
The layers in wiki pages go unnoticed by readers, but are important for all sorts of reasons.

Part of the fun of wiki editing is the flexibility and interactivity that come from templates and scripting, which build internal and external layers (respectively) onto page content. The more involved editors become in community building, the more likely they are to create layers within or atop the wiki page through those templates and scripts.

While the most important parts of a community are the people involved and the content inside, these additional layers come with best-practice recommendations, too. Like the icing on and the filling in a cake, a solid foundation of content deserves details that make things interesting.

JavaScript and Performance

JavaScript (sometimes abbreviated as "JS") can be a great way of adding some fun flair to communities. Because of how it is delivered, most JavaScript code (usually referred to as a "script") runs inside the viewer's web browser. These scripts are recipes that change the way a page acts, and often create new elements on a page. Because they are accessed so frequently (every time a page loads), these scripts are stored in a "cache" both on our servers and in the viewer's web browser. That way, potentially large scripts don't have to be re-downloaded every time and are instead refreshed only periodically (anywhere from instantly to every few hours, and in some cases up to a year). If you are adding JavaScript to your wiki and don't see changes right away, you are probably encountering a cache issue.
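
For context, a site-wide script typically lives at MediaWiki:Common.js (personal scripts live in your own common.js page), and scripts from Fandom's Dev community are loaded with importArticles. A minimal sketch, with a hypothetical script name:

    /* In MediaWiki:Common.js — runs for every desktop visitor.
       Edits here are cached, so changes may take time to appear;
       a hard refresh (Ctrl+F5) bypasses your browser's cached copy. */
    importArticles({
        type: 'script',
        articles: [
            'u:dev:MediaWiki:ExampleScript/code.js' // hypothetical Dev wiki script
        ]
    });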

JavaScript powers many functions, both in the skin around your content and in the advanced content users provide with personal or site-wide scripts. Fandom makes an active effort to reduce the amount of JavaScript we supply in the skin, because every bit we add has to be downloaded to a viewer's web browser. This affects the time it takes to load a community, which is noticed both by users (particularly those on slow or unstable connections) and by search engines. "Page Speed" is Google's term for the time it takes to download everything that goes into a page (including the scripts), produce a meaningful display on the screen, and have the page ready for users to interact with; all of these are important factors, because users who feel a delay before they can do anything have a bad experience. How much memory and processing a browser needs to load and run scripts is also a factor, which we try to reduce where possible. Both user experience and search engine ranking (SEO) are directly impacted by the amount and complexity of JavaScript on wiki pages, particularly for logged-out users.
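
If you're curious how your own community fares, one rough way to peek at these numbers is the browser's Paint Timing API; a sketch you can paste into the developer console (support varies by browser):

    // Log when the browser first painted anything, and when it first
    // painted meaningful content, for the current page.
    performance.getEntriesByType('paint').forEach(function (entry) {
        console.log(entry.name + ': ' + Math.round(entry.startTime) + ' ms');
    });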

Being mindful of how much is added, including scripts imported from Fandom's user-led developer community, is itself a best practice. Experienced scripters have not had to worry much in the past about how their scripts affected SEO, because search engines would ignore scripts. That is no longer the case: Google now uses a different engine to examine communities, one that downloads and executes JavaScript. Adding HTML elements with JavaScript now places Google's examination of a page (also called the "crawl") into a delayed queue, which can extend processing time by weeks. Page Speed also plays a more impactful role in ranking as a result.

Interactive Content

JavaScript can do some amazing things to wiki pages. It can make elements appear seemingly out of nowhere or respond to user actions, like pressing buttons or entering data into fields. The magic happens inside the browser to shape what's on the page.
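
As a minimal sketch of that kind of interactivity (the container id here is hypothetical):

    // Create a button and respond when it is pressed.
    var button = document.createElement('button');
    button.textContent = 'Reveal spoiler';
    button.addEventListener('click', function () {
        button.textContent = 'The butler did it.';
    });
    // Attach it to a hypothetical container somewhere on the page.
    document.getElementById('spoiler-box').appendChild(button);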

There are several scripts, particularly additions from Fandom's fan developer community, that add to or change the way the skin works. There is no harm in adding these for your personal use when they make your reading or editing experience easier or more enjoyable. Some scripts are marked strictly for site-wide use, as opposed to personal use. Those marked for site-wide use will reach anonymous users and will have SEO implications; therefore, less is more.

The most common uses of JavaScript that few think about involve showing and hiding content: collapsibles, tabbers, and TabView. The mw-collapsible method of collapsing elements is well established and built into MediaWiki itself, so it is supported by the mobile skin and works properly in most circumstances. The same caveats we have mentioned in earlier posts about hidden content apply: content that is collapsed signals to search engines and readers (rightly or wrongly) that whatever is not shown on first load is less important. Tabbers magnify the effect by hiding multiple layers from immediate view.
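
For reference, the built-in pattern needs no custom JavaScript at all; a sketch in plain page wikitext, using the standard MediaWiki class names:

    <div class="mw-collapsible mw-collapsed">
    This content starts collapsed, which tells readers and search
    engines alike that it is less important than what is shown first.
    </div>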

As a best practice, we recommend that JavaScript not be used to change the main content area or to add elements directly inside it. The primary reason is that JavaScript not built into Fandom's mobile skin is not used in mobile views, which means content inserted only with JavaScript will not be seen by a large portion of your viewers. From a user experience perspective, a block of something interactive in the middle of a page is distracting; an otherwise empty page (as would be seen on the mobile skin) is outright confusing. As an alternative, we suggest anchoring pure JavaScript tools like statistical or game calculators to a Special page, where they will not be affected by the mobile skin or crawled by search engines.
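
One common way to follow this advice is to key a tool to a dedicated page, so it never touches article content; a sketch, where the subpage name is hypothetical:

    // Run the calculator only on its own page, never inside articles.
    // Subpages of Special:BlankPage are a popular home for pure-JS tools.
    if (mw.config.get('wgPageName') === 'Special:BlankPage/DamageCalculator') {
        var container = document.getElementById('mw-content-text');
        container.textContent = ''; // clear the blank page's placeholder
        // ...build the calculator interface inside `container` here...
    }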

Some JavaScript tools rely on pulling information from other wiki pages, a process commonly known as AJAX. The TabView extension uses AJAX on the desktop skin to place content from other pages into tabs created with JavaScript. The displayed content is crawled by search engines more than once, and Google indexing (listing on the results pages) is further delayed by the new crawling method explained above. Readers don't realize they're seeing the same content in multiple places, and editors have more difficulty pinpointing where content actually lives. As a result, where we at Fandom once saw TabView as a good portable option, we no longer feel its benefits outweigh the drawbacks, and we suggest reducing the use of it and other AJAX-based scripts. As an alternative, previous posts have outlined the recommended split-and-link method, which we believe produces stronger content overall that is easier on readers and editors.
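
For the curious, the AJAX pattern under discussion usually looks something like this sketch (the page and element names are hypothetical):

    // Fetch another page's rendered content and drop it into a tab.
    mw.loader.using('mediawiki.api').then(function () {
        return new mw.Api().get({
            action: 'parse',
            page: 'Weapons/Swords', // hypothetical source page
            prop: 'text'
        });
    }).then(function (data) {
        // The API returns the page's rendered HTML under parse.text['*'].
        $('#tab-swords').html(data.parse.text['*']); // hypothetical tab element
    });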

Templates

Templates are the internal layers that make for reusable elements. Templates can be created with wikitext, sometimes in combination with internal pieces of software called ParserFunctions, or with extensions. One example of templates built with an extension is Fandom's Portable Infoboxes, which provide a fast and fairly standard way to make a common element. ParserFunctions are utility pieces that work like a small army of tiny tools, changing or interpreting text before passing it along to the reader's browser.
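
As one tiny example of those tools at work, the #if ParserFunction chooses between outputs before anything reaches the browser; a sketch with a hypothetical image parameter:

    {{#if: {{{image|}}}
    | [[File:{{{image}}}|250px]]
    | ''No image available''
    }}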

A template that calls another template is called a "nested" template. To explain how this works and why it's important: the software that runs our wikis makes multiple passes over a page, expanding each template into its result. If that result contains more templates, more passes are required. This adds processing time and can confuse editors who have to dig through multiple layers to find a problem. We recommend limiting nesting to three or fewer layers where possible.
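
To picture those expansion passes, here is a hypothetical three-layer chain, with each arrow marking another pass the software must make:

    {{Stat|attack|55}}                         (pass 1: expand Stat)
      → {{StatRow|label=attack|value=55}}      (pass 2: expand StatRow)
        → {{Color|text=55|hue=red}}            (pass 3: expand Color)
          → <span style="color:red;">55</span> (plain output at last)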

It's always a good idea to label and comment your templates when you make them, so that those who edit after you know what they are looking at. It is also a good idea to use user-friendly parameter names (the parts of template calls that address specific labels, like | name = Jon Snow). New editors will not always understand that ATK means "attack" (as is fairly standard in English-language games); or worse, an unexplained ATK may mean something other than "attack", which is even more confusing.
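
A quick sketch of the difference, with a hypothetical character template:

    <!-- Harder for newcomers to decode: -->
    {{Character|ATK=55|DEF=40}}

    <!-- Friendlier, self-explanatory parameter names: -->
    {{Character|attack=55|defense=40}}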

Templates can also be created with a different kind of scripting language, called Lua. Lua templates can be very powerful and very fast, replacing many layers of nested templates and ParserFunctions. They are designed to be expanded and interpreted in a single pass, so good sets of Lua functions (called Modules) import other modules rather than expanding into wikitext that must be expanded again. This avoids the nesting issues above, usually makes the code easier to understand, and runs much faster than ParserFunctions alone. However, keep in mind that Lua is not for beginners, and committing to a Lua path may mean fewer people are able to understand your code or fix it if it breaks while you are unavailable. For simple templates, Lua can also be overkill where the same effect can be accomplished with classic wikitext.
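
A minimal sketch of what such a Module looks like, assuming a hypothetical helper module rather than any real code on your wiki:

    -- Module:Stats (hypothetical). Imported modules are loaded with
    -- require() instead of being expanded as wikitext, so everything
    -- resolves in a single pass.
    local format = require('Module:Format') -- hypothetical helper module

    local p = {}

    -- Called from wikitext as {{#invoke:Stats|attack|55}}
    function p.attack(frame)
        local value = frame.args[1] or '0'
        return format.asStat('Attack', value) -- hypothetical helper function
    end

    return p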

As a side note: many wikis add a [[Source]] or V·T·E link to their templates to direct readers to the template documentation. Doing so has no proven user experience benefit, as those links historically see very few humans clicking through them. They also create a visually inconsistent break in the template display. Finally, the link forces search engines to repeatedly attempt to crawl template documentation code that we intentionally hide from crawling, so the crawler encounters a "404" or dead page, reducing the perceived quality and authority of the community in question. We strongly suggest removing these links from templates intended for use on article pages.

Automation

Wiki editors have long had to work with small amounts of input text and squeeze as much output out of it as possible. This sometimes results in templates producing the bulk or the entirety of some pages. There are multiple failsafes within the server code to discourage templates from producing far more output than their input, and one of note is the "post-expand include size" limit. For security reasons, this prevents a massive output ratio from little template parameter input; you can see how close a page comes to the limits in the "NewPP limit report" comment near the bottom of its HTML source. The biggest failsafe is that when pages become too complex or too heavy for the server to process, they simply become inaccessible or uneditable. At that point, Staff intervention is often required, and portions of the page should be split off to reduce complexity.

Smaller interventions can also have subtle but important effects. For example, relying on an infobox template to call a specific image based on the article's page name can seem very simple, but it breaks down if the article has a non-standard name or is copied to another page for testing.
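
A sketch of that pattern and a more forgiving alternative, inside a hypothetical infobox template:

    <!-- Fragile: guesses the image from the page name, which breaks on
         non-standard names or test copies. -->
    [[File:{{PAGENAME}}.png|250px]]

    <!-- Safer: accept an explicit parameter, with the guess as a fallback. -->
    [[File:{{{image|{{PAGENAME}}.png}}}|250px]]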

Much of the habit of automation is geared toward making pages simpler for editors. In practice, it often makes editing simpler only for the small subset of editors who fully understand a process that is not always intuitive. New editors should, ideally, be able to understand what goes into a system with minimal education about its inner workings. As in any system in science or engineering, the more complex it becomes, the more working parts can fail when the input is not entirely standard. As an added benefit, finding and entering information, rather than having an automated system compute it for readers, encourages more edits, because the work is not already done!

Finally, it's good practice to place as much content as possible outside templates, in the basic canvas of the article, because doing so simplifies the editing process. It may help to think of basic article content as "organic food": best processed as little as possible, and with as few steps as necessary, before it is ready to eat. No more than an average of 30% of your basic text should be processed by templates. Automation should be only the tip of the proverbial iceberg, because keeping it small makes editing itself more accessible to everyone.

Keeping it simple

The tl;dr version
JavaScript can be a powerful addition to a wiki and can transcend the limitations of wikitext, but too much of a good thing makes processing take longer, both for users and for search engines. Templates are the connective tissue of wiki pages and help content shine; making them too deep or too complicated can confound both our systems and potential editors.

Best Practices moving forward

We've had a great start in 2019 with this Best Practices series, and have reached many communities with both timely and evergreen advice. Based on feedback, we're going to change the format moving forward; this will be the last post of the year, and we intend to come back refreshed in 2020. We hope that these have been helpful to you as suggestions and guidelines for communities both new and well-established. The conversation is always open in our Discord server's #best-practices channel, and we invite you to participate with questions and concerns.


Fandom Staff
Isaac rose from the ranks of Fandom contributors to join the Community Technical team in late 2015. He is now an Editor Experience Specialist, with a focus on User Education. Isaac is a television and book fanatic, a sucker for the great outdoors, and a lifelong learner. He's been coding since before attending school but didn't discover Fandom until 2010. Even now, he's hard-pressed to identify his favorite fandoms.