----MODIFICATIONS TO CITATION/CS1 FOR LEGAL CITATION TEMPLATE CODE ADAPTED PER Wikimedia CC-BY-SA (whatever version, doesn't matter, not a great code license) WITH HOPES OF MERGE. (All base CS1 code edits will begin and end as this comment, with username removed in Template-space release.) --
require ('strict');
--[[--------------------------< F O R W A R D   D E C L A R A T I O N S >--------------------------------------

each of these counts against the Lua upvalue limit

]]
local validation;		-- functions in Module:Citation/CS1/Date_validation

local utilities;		-- functions in Module:Citation/CS1/Utilities
local z = {};			-- table of tables in Module:Citation/CS1/Utilities

local identifiers;		-- functions and tables in Module:Citation/CS1/Identifiers
local metadata;			-- functions in Module:Citation/CS1/COinS
local cfg = {};			-- table of configuration tables that are defined in Module:Citation/CS1/Configuration
local whitelist = {};	-- mw.loadData ('Module:Citation/CS1/Whitelist');	-- table of tables listing valid template parameter names; defined in Module:Citation/CS1/Whitelist
--[[
modifying CS1/Whitelist here so we don't have too many mirrors; we must copy data to a new table, but that's a
lot of little syntax changes and error checking in this doc, so we'll just copy the static CS1 tables.  No
release version can reallocate large tables on every call.

local citelegal_basic_args = 
local citelegal_unique_args = 

for k,v in pairs(citelegal_basic_args) do whitelist.basic_arguments[k] = v end;
for k,v in pairs(citelegal_unique_args) do whitelist.unique_arguments.legal[k] = v end;
table.insert (whitelist.unique_param_template_list, 'legal');	-- not doing this anymore (and doesn't work on read-only)

modifying CS1/Configuration here so we don't have too many mirrors; we had to whitelist aliases first

local citelegal_aliases = 

for k,v in pairs(citelegal_aliases) do cfg.aliases[k] = v end;	-- not doing this anymore (and doesn't work on read-only)
]]
--[[--------------------------< P A G E   S C O P E   V A R I A B L E S >--------------------------------------

declare variables here that have page-wide scope that are not brought in from other modules; that are created
here and used here

]]
local added_deprecated_cat;		-- Boolean flag so that the category is added only once
local added_vanc_errs;			-- Boolean flag so we only emit one Vancouver error / category
local added_generic_name_errs;	-- Boolean flag so we only emit one generic name error / category and stop testing names once an error is encountered
local Frame;					-- holds the module's frame table
local is_preview_mode;			-- true when article is in preview mode; false when using 'Preview page with this template' (previewing the module)
local is_sandbox;				-- true when using sandbox modules to render citation
--[[--------------------------< F I R S T _ S E T >------------------------------------------------------------

Locates and returns the first set value in a table of values where the order established in the table,
left-to-right (or top-to-bottom), is the order in which the values are evaluated.  Returns nil if none are set.

This version replaces the original 'for _, val in pairs do' and a similar version that used ipairs.  With the
pairs version the order of evaluation could not be guaranteed.  With the ipairs version, a nil value would
terminate the for-loop before it reached the actual end of the list.

]]
local function first_set (list, count)
	local i = 1;
	while i <= count do						-- loop through all items in list
		if utilities.is_set (list[i]) then
			return list[i];					-- return the first set list member
		end
		i = i + 1;							-- point to next
	end
end
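-- Illustrative behaviour (assuming utilities.is_set() treats nil and the empty string as unset):
--   first_set ({nil, '', 'three', 'four'}, 4)	--> 'three' (first member that is set)
--   first_set ({nil, '', nil}, 3)				--> nil (no member is set)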
--[[--------------------------< A D D _ V A N C _ E R R O R >--------------------------------------------------

Adds a single Vancouver system error message to the template's output regardless of how many errors actually
exist.  To prevent duplication, added_vanc_errs is nil until an error message is emitted.

added_vanc_errs is a Boolean declared in page scope variables above

]]
local function add_vanc_error (source, position)
	if added_vanc_errs then return end

	added_vanc_errs = true;						-- note that we've added this category
	utilities.set_message ('err_vancouver', {source, position});
end
--[[--------------------------< I S _ S C H E M E >------------------------------------------------------------

does this thing that purports to be a URI scheme seem to be a valid scheme?

]]
local function is_scheme (scheme)
	return scheme and scheme:match ('^%a[%a%d%+%.%-]*:');	-- true if scheme is set and matches the pattern
end
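-- Illustrative results (the match, when found, is returned, so the result is truthy or nil):
--   is_scheme ('https:')	--> 'https:' (truthy)
--   is_scheme ('6ttp:')	--> nil (scheme must begin with a letter)
--   is_scheme ('http')		--> nil (no terminating colon)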
--[=[-------------------------< I S _ D O M A I N _ N A M E >--------------------------------------------------

Does this thing that purports to be a domain name seem to be a valid domain name?

Syntax defined here: http://tools.ietf.org/html/rfc1034#section-3.5
BNF defined here: https://tools.ietf.org/html/rfc4234
Single character names are generally reserved; see https://tools.ietf.org/html/draft-ietf-dnsind-iana-dns-01#page-15;
	see also [[Single-letter second-level domain]]
list of TLDs: https://www.iana.org/domains/root/db

RFC 952 (modified by RFC 1123) requires the first and last character of a hostname to be a letter or a digit.
Between the first and last characters the name may use letters, digits, and the hyphen.

Also allowed are IPv4 addresses.  IPv6 not supported.

domain is expected to be stripped of any path so that the last character in it is the last character of the TLD.
TLD is two or more alpha characters.  Any preceding '//' (from splitting a URL with a scheme) will be stripped
here.  Perhaps not necessary but retained in case it is necessary for IPv4 dot decimal.

There are several tests:
	the first character of the whole domain name including subdomains must be a letter or a digit
	internationalized domain name (ASCII characters with .xn-- ASCII Compatible Encoding (ACE) prefix xn-- in the TLD) see https://tools.ietf.org/html/rfc3490
	single-letter/digit second-level domains in the .org, .cash, and .today TLDs
	q, x, and z SL domains in the .com TLD
	i and q SL domains in the .net TLD
	single-letter SL domains in the ccTLDs (where the ccTLD is two letters)
	two-character SL domains in gTLDs (where the gTLD is two or more letters)
	three-plus-character SL domains in gTLDs (where the gTLD is two or more letters)
	IPv4 dot-decimal address format; TLD not allowed

returns true if domain appears to be a proper name and TLD or IPv4 address, else false

]=]
local function is_domain_name (domain)
	if not domain then
		return false;						-- if not set, abandon
	end

	domain = domain:gsub ('^//', '');		-- strip '//' from domain name if present; done here so we only have to do it once

	if not domain:match ('^[%w]') then		-- first character must be letter or digit
		return false;
	end

	if domain:match ('^%a+:') then			-- hack to detect things that look like s:Page:Title where Page: is namespace at Wikisource
		return false;
	end

	local patterns = 

	for _, pattern in ipairs (patterns) do	-- loop through the patterns list
		if domain:match (pattern) then
			return true;					-- if a match then we think that this thing that purports to be a URL is a URL
		end
	end

	for _, d in ipairs do					-- look for single letter second level domain names for these top level domains
		if domain:match ('%f[%w][%w]%.' .. d) then
			return true
		end
	end
	return false;							-- no matches, we don't know what this thing is
end
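-- Illustrative results; positive results additionally depend on the patterns list assigned above:
--   is_domain_name ('-example.com')	--> false (first character must be a letter or digit)
--   is_domain_name ('s:Page:Title')	--> false (looks like an interwiki/namespace prefix)
--   is_domain_name ('//example.com')	--> the leading '//' is stripped before the pattern tests run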
--[[--------------------------< I S _ U R L >------------------------------------------------------------------

returns true if the scheme and domain parts of a URL appear to be a valid URL; else false.

This function is the last step in the validation process.  This function is separate because there are cases that
are not covered by split_url(), for example is_parameter_ext_wikilink() which is looking for bracketed external
wikilinks.

]]
local function is_url (scheme, domain)
	if utilities.is_set (scheme) then				-- if scheme is set check it and domain
		return is_scheme (scheme) and is_domain_name (domain);
	else
		return is_domain_name (domain);				-- scheme not set when URL is protocol-relative
	end
end
--[[--------------------------< S P L I T _ U R L >------------------------------------------------------------

Split a URL into its scheme and domain portions.

]]
local function split_url (url_str)
	local scheme, authority, domain;

	url_str = url_str:gsub ('([%a%d])%.?[/%?#].*$', '%1');				-- strip FQDN terminator and path(/), query(?), fragment (#) (the capture prevents false replacement of '//')

	if url_str:match ('^//%S*') then									-- if there is what appears to be a protocol-relative URL
		domain = url_str:match ('^//(%S*)')
	elseif url_str:match ('%S-:/*%S+') then								-- if there is what appears to be a scheme, optional authority indicator, and domain name
		scheme, authority, domain = url_str:match ('(%S-:)(/*)(%S+)');	-- extract the scheme, authority indicator, and domain portions
		if utilities.is_set (authority) then
			authority = authority:gsub ('//', '', 1);					-- replace 1 pair of '/' with nothing;
			if utilities.is_set (authority) then						-- if anything left (1 or 3+ '/' where authority should be) then
				return scheme;											-- return scheme only making domain nil which will cause an error message
			end
		else
			if not scheme:match ('^news:') then							-- except for news:..., MediaWiki won't link URLs that do not have authority indicator; TODO: a better way to do this test?
				return scheme;											-- return scheme only making domain nil which will cause an error message
			end
		end
		domain = domain:gsub ('(%a):%d+', '%1');						-- strip port number if present
	end

	return scheme, domain;
end
--[[--------------------------< L I N K _ P A R A M _ O K >----------------------------------------------------

checks the content of |title-link=, |series-link=, |author-link=, etc. for properly formatted content: no
wikilinks, no URLs.

Link parameters are to hold the title of a Wikipedia article, so none of the WP:TITLESPECIALCHARACTERS are
allowed: # < > [ ] | _ except the underscore which is used as a space in wiki URLs and # which is used for
section links

returns false when the value contains any of these characters.

When there are no illegal characters, this function returns TRUE if value DOES NOT appear to be a valid URL (the
|-link= parameter is ok); else false when value appears to be a valid URL (the |-link= parameter is NOT ok).

]]
local function link_param_ok (value)
	local scheme, domain;
	if value:find ('[<>%[%]|]') then				-- if any prohibited characters
		return false;
	end

	scheme, domain = split_url (value);				-- get scheme or nil and domain or nil from URL;
	return not is_url (scheme, domain);				-- return true if value DOES NOT appear to be a valid URL
end
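-- Illustrative results (URL detection relies on is_url() and the domain patterns noted above):
--   link_param_ok ('Example article')		--> true (plain article title)
--   link_param_ok ('[[Example article]]')	--> false (contains prohibited characters)
--   link_param_ok ('https://example.com')	--> false (value appears to be a URL)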
--[[--------------------------< L I N K _ T I T L E _ O K >----------------------------------------------------

Use link_param_ok() to validate a |-link= value and its matching title value.

The title may be wiki-linked, but not when the |-link= parameter has a value; this function emits an error
message when that condition exists.

check for inter-language interwiki-link prefix.  prefix must be a MediaWiki-recognized language code and must
begin with a colon.

]]
local function link_title_ok (link, lorig, title, torig)
	local orig;

	if utilities.is_set (link) then					-- don't bother if -link doesn't have a value
		if not link_param_ok (link) then			-- check |-link= markup
			orig = lorig;							-- identify the failing link parameter
		elseif title:find ('%[%[') then				-- check |title= for wikilink markup
			orig = torig;							-- identify the failing |title= parameter
		elseif link:match ('^%a+:') then			-- if the link is what looks like an interwiki
			local prefix = link:match ('^(%a+):'):lower();	-- get the interwiki prefix

			if cfg.inter_wiki_map[prefix] then		-- if prefix is in the map, must have preceding colon
				orig = lorig;						-- flag as error
			end
		end
	end

	if utilities.is_set (orig) then
		link = '';									-- unset
		utilities.set_message ('err_bad_paramlink', orig);	-- URL or wikilink in |title= with |title-link=;
	end

	return link;									-- link if ok, empty string else
end
--[[--------------------------< C H E C K _ U R L >------------------------------------------------------------

Determines whether a URL string appears to be valid: no spaces, and scheme and domain portions that is_url()
accepts; 'news:' URLs get a special-case newsgroup-name test.

]]
local function check_url (url_str)
	if nil == url_str:match ("^%S+$") then			-- if there are any spaces in |url= value it can't be a proper URL
		return false;
	end
	local scheme, domain;

	scheme, domain = split_url (url_str);			-- get scheme or nil and domain or nil from URL;

	if 'news:' == scheme then						-- special case for newsgroups
		return domain:match('^[%a%d%+%-_]+%.[%a%d%+%-_%.]*[%a%d%+%-_]$');
	end

	return is_url (scheme, domain);					-- return true if value appears to be a valid URL
end
--[=[-------------------------< I S _ P A R A M E T E R _ E X T _ W I K I L I N K >----------------------------

Return true if a parameter value has a string that begins and ends with square brackets [ and ] and the first
non-space characters following the opening bracket appear to be a URL.  The test will also find external wikilinks
that use protocol-relative URLs.  Also finds bare URLs.

The frontier pattern prevents a match on interwiki-links which are similar to scheme:path URLs.  The tests that
find bracketed URLs are required because the parameters that call this test (currently |title=, |chapter=, |work=,
and |publisher=) may have wikilinks and there are articles or redirects like '//Hus' so, while uncommon,
|title=//Hus is possible as might be [[//Hus]].

]=]
local function is_parameter_ext_wikilink (value)
	local scheme, domain;

	if value:match ('%f[%[]%[%a%S*:%S+.*%]') then		-- if ext. wikilink with scheme and domain: [xxxx://yyyyy.zzz]
		scheme, domain = split_url (value:match ('%f[%[]%[(%a%S*:%S+).*%]'));
	elseif value:match ('%f[%[]%[//%S+.*%]') then		-- if protocol-relative ext. wikilink: [//yyyyy.zzz]
		scheme, domain = split_url (value:match ('%f[%[]%[(//%S+).*%]'));
	elseif value:match ('%a%S*:%S+') then				-- if bare URL with scheme; may have leading or trailing plain text
		scheme, domain = split_url (value:match ('(%a%S*:%S+)'));
	elseif value:match ('//%S+') then					-- if protocol-relative bare URL: //yyyyy.zzz; may have leading or trailing plain text
		scheme, domain = split_url (value:match ('(//%S+)'));	-- what is left should be the domain
	else
		return false;									-- didn't find anything that is obviously a URL
	end

	return is_url (scheme, domain);						-- return true if value appears to be a valid URL
end
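-- Illustrative results (positive cases also depend on is_url() accepting the extracted domain):
--   is_parameter_ext_wikilink ('[https://example.com Example]')	--> true (bracketed external link)
--   is_parameter_ext_wikilink ('see https://example.com')			--> true (bare URL)
--   is_parameter_ext_wikilink ('[[Example article]]')				--> false (ordinary wikilink)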
--[[-------------------------< C H E C K _ F O R _ U R L >-----------------------------------------------------

loop through a list of parameters and their values.  Look at the value and if it has an external link, emit an
error message.

]]
local function check_for_url (parameter_list, error_list)
	for k, v in pairs (parameter_list) do				-- for each parameter in the list
		if is_parameter_ext_wikilink (v) then			-- look at the value; if there is a URL add an error message
			table.insert (error_list, utilities.wrap_style ('parameter', k));
		end
	end
end
--[[--------------------------< S A F E _ F O R _ U R L >------------------------------------------------------

Escape sequences for content that will be used for URL descriptions

]]
local function safe_for_url (str)
	if str:match ("%[%[.-%]%]") ~= nil then
		utilities.set_message ('err_wikilink_in_url',);
	end

	return str:gsub ('[%[%]\n]',);
end
--[[--------------------------< E X T E R N A L _ L I N K >----------------------------------------------------

Format an external link with error checking

]]
local function external_link (URL, label, source, access)
	local err_msg = '';
	local domain;
	local path;
	local base_url;

	if not utilities.is_set (label) then
		label = URL;
		if utilities.is_set (source) then
			utilities.set_message ('err_bare_url_missing_title',);
		else
			error (cfg.messages["bare_url_no_origin"]);
		end
	end
	if not check_url (URL) then
		utilities.set_message ('err_bad_url',);
	end

	domain, path = URL:match ('^([/%.%-%+:%a%d]+)([/%?#].*)$');	-- split the URL into scheme plus domain and path
	if path then												-- if there is a path portion
		path = path:gsub ('[%[%]]',);							-- replace '[' and ']' with their percent-encoded values
		URL = table.concat ({domain, path});					-- and reassemble
	end

	base_url = table.concat ({"[", URL, " ", safe_for_url (label), "]"});	-- assemble a wiki-markup URL

	if utilities.is_set (access) then							-- access level (subscription, registration, limited)
		base_url = utilities.substitute (cfg.presentation['ext-link-access-signal'],);	-- add the appropriate icon
	end

	return base_url;
end
--[[--------------------------< D E P R E C A T E D _ P A R A M E T E R >--------------------------------------

Categorize and emit an error message when the citation contains one or more deprecated parameters.  The function
includes the offending parameter name in the error message.  Only one error message is emitted regardless of the
number of deprecated parameters in the citation.

added_deprecated_cat is a Boolean declared in page scope variables above

]]
local function deprecated_parameter (name)
	if not added_deprecated_cat then
		added_deprecated_cat = true;						-- note that we've added this category
		utilities.set_message ('err_deprecated_params', {name});	-- add error message
	end
end
--[=[-------------------------< K E R N _ Q U O T E S >--------------------------------------------------------

Apply kerning to open the space between the quote mark provided by the module and a leading or trailing quote
mark contained in a |title= or |chapter= parameter's value.

This function will positive kern either single or double quotes:
	"'Unkerned title with leading and trailing single quote marks'"
	" 'Kerned title with leading and trailing single quote marks' " (in real life the kerning isn't as wide as this example)
Double single quotes (italic or bold wiki-markup) are not kerned.

Replaces Unicode quote marks in plain text or in the label portion of a [[L|D]] style wikilink with typewriter
quote marks regardless of the need for kerning.  Unicode quote marks are not replaced in simple [[D]] wikilinks.

Call this function for chapter titles, for website titles, etc.; not for book titles.

]=]
local function kern_quotes (str)
	local cap = '';
	local wl_type, label, link;

	wl_type, label, link = utilities.is_wikilink (str);	-- wl_type is: 0, no wl (text in label variable); 1, [[D]]; 2, [[L|D]]

	if 1 == wl_type then									-- [[D]] simple wikilink with or without quote marks
		if mw.ustring.match (str, '%[%[[\"“”\'‘’].+[\"“”\'‘’]%]%]') then	-- leading and trailing quote marks
			str = utilities.substitute (cfg.presentation['kern-wl-both'], str);
		elseif mw.ustring.match (str, '%[%[[\"“”\'‘’].+%]%]') then			-- leading quote marks
			str = utilities.substitute (cfg.presentation['kern-wl-left'], str);
		elseif mw.ustring.match (str, '%[%[.+[\"“”\'‘’]%]%]') then			-- trailing quote marks
			str = utilities.substitute (cfg.presentation['kern-wl-right'], str);
		end

	else													-- plain text or [[L|D]]; text in label variable
		label = mw.ustring.gsub (label, '[“”]', '\"');		-- replace “” (U+201C & U+201D) with " (typewriter double quote mark)
		label = mw.ustring.gsub (label, '[‘’]', '\'');		-- replace ‘’ (U+2018 & U+2019) with ' (typewriter single quote mark)

		cap = mw.ustring.match (label, "^([\"\'][^\'].+)");	-- match leading double or single quote but not doubled single quotes (italic markup)
		if utilities.is_set (cap) then
			label = utilities.substitute (cfg.presentation['kern-left'], cap);
		end

		cap = mw.ustring.match (label, "^(.+[^\'][\"\'])$")	-- match trailing double or single quote but not doubled single quotes (italic markup)
		if utilities.is_set (cap) then
			label = utilities.substitute (cfg.presentation['kern-right'], cap);
		end

		if 2 == wl_type then
			str = utilities.make_wikilink (link, label);	-- reassemble the wikilink
		else
			str = label;
		end
	end
	return str;
end
--[[--------------------------< F O R M A T _ S C R I P T _ V A L U E >----------------------------------------

|script-title= holds title parameters that are not written in Latin-based scripts: Chinese, Japanese, Arabic,
Hebrew, etc.  These scripts should not be italicized and may be written right-to-left.  The value supplied by
|script-title= is concatenated onto Title after Title has been wrapped in italic markup.

Regardless of language, all values provided by |script-title= are wrapped in <bdi>...</bdi> tags to isolate RTL
languages from the English left to right.

|script-title= provides a unique feature.  The value in |script-title= may be prefixed with a two-character ISO
639-1 language code and a colon:
	|script-title=ja:*** *** (where * represents a Japanese character)
Spaces between the two-character code and the colon and the colon and the first script character are allowed:
	|script-title=ja : *** ***
	|script-title=ja: *** ***
	|script-title=ja :*** ***
Spaces preceding the prefix are allowed: |script-title = ja:*** ***

The prefix is checked for validity.  If it is a valid ISO 639-1 language code, the lang attribute (lang="ja") is
added to the <bdi> tag so that browsers can know the language the tag contains.  This may help the browser render
the script more correctly.  If the prefix is invalid, the lang attribute is not added.  At this time there is no
error message for this condition.

Supports |script-title=, |script-chapter=, |script-

]]
local function format_script_value (script_value, script_param)
	local lang = '';								-- initialize to empty string
	local name;
	if script_value:match('^%l%l%l?%s*:') then		-- if first 3 or 4 non-space characters are script language prefix
		lang = script_value:match('^(%l%l%l?)%s*:%s*%S.*');	-- get the language prefix or nil if there is no script
		if not utilities.is_set (lang) then
			utilities.set_message ('err_script_parameter',);	-- prefix without 'title'; add error message
			return '';								-- script_value was just the prefix so return empty string
		end
		-- if we get this far we have prefix and script
		name = cfg.lang_code_remap[lang] or mw.language.fetchLanguageName(lang, cfg.this_wiki_code);	-- get language name so that we can use it to categorize
		if utilities.is_set (name) then				-- is prefix a proper ISO 639-1 language code?
			script_value = script_value:gsub ('^%l+%s*:%s*', '');	-- strip prefix from script
			-- is prefix one of these language codes?
			if utilities.in_array (lang, cfg.script_lang_codes) then
				utilities.add_prop_cat ('script',)
			else
				utilities.set_message ('err_script_parameter',);	-- unknown script-language; add error message
			end
			lang = ' lang="' .. lang .. '" ';		-- convert prefix into a lang attribute
		else
			utilities.set_message ('err_script_parameter',);	-- invalid language code; add error message
			lang = '';								-- invalid so set lang to empty string
		end
	else
		utilities.set_message ('err_script_parameter',);		-- no language code prefix; add error message
	end
	script_value = utilities.substitute (cfg.presentation['bdi'],);	-- isolate in case script is RTL

	return script_value;
end
--[[--------------------------< S C R I P T _ C O N C A T E N A T E >------------------------------------------

Initially for |title= and |script-title=, this function concatenates those two parameter values after the script
value has been wrapped in <bdi> tags.

]]
local function script_concatenate (title, script, script_param)
	if utilities.is_set (script) then
		script = format_script_value (script, script_param);	-- <bdi> tags, lang attribute, categorization, etc.; returns empty string on error
		if utilities.is_set (script) then
			title = title .. ' ' .. script;						-- concatenate title and script title
		end
	end
	return title;
end
--[[--------------------------< W R A P _ M S G >--------------------------------------------------------------

Applies additional message text to various parameter values.  Supplied string is wrapped using a message_list
configuration taking one argument.  Supports lower case text for templates.  Additional text taken from
citation_config.messages - the reason this function is similar to but separate from wrap_style.

]]
local function wrap_msg (key, str, lower)
	if not utilities.is_set (str) then
		return "";
	end
	if true == lower then
		local msg;
		msg = cfg.messages[key]:lower();			-- set the message to lower case before
		return utilities.substitute (msg, str);		-- including template text
	else
		return utilities.substitute (cfg.messages[key], str);
	end
end
--[[--------------------------< W I K I S O U R C E _ U R L _ M A K E >----------------------------------------

makes a Wikisource URL from a Wikisource interwiki-link; returns the URL and appropriate label, nil else.

str is the value assigned to |chapter= (or aliases) or |title= or |title-link=

]]
local function wikisource_url_make (str)
	local wl_type, D, L;
	local ws_url, ws_label;
	local wikisource_prefix = table.concat ;

	wl_type, D, L = utilities.is_wikilink (str);	-- wl_type is 0 (not a wikilink), 1 (simple wikilink), 2 (complex wikilink)

	if 0 == wl_type then							-- not a wikilink

	elseif 1 == wl_type then						-- simple wikilink
		str = D:match ('^[Ww]ikisource:(.+)') or D:match ('^[Ss]:(.+)');	-- article title from interwiki link with long-form or short-form namespace
		if utilities.is_set (str) then
			ws_url = table.concat ;
			ws_label = str;							-- label for the URL
		end
	elseif 2 == wl_type then						-- complex wikilink

	end

	if ws_url then
		ws_url = mw.uri.encode (ws_url, 'WIKI');	-- make a usable URL
		ws_url = ws_url:gsub ('%%23', '#');			-- undo percent-encoding of fragment marker
	end

	return ws_url, ws_label, L or D;				-- return proper URL or nil and a label or nil
end
--[[--------------------------< F O R M A T _ P E R I O D I C A L >--------------------------------------------

Format the |periodical=, |script-periodical=, and |trans-periodical= parameter values into a single styled
periodical value.

]]
local function format_periodical (script_periodical, script_periodical_source, periodical, trans_periodical)

	if not utilities.is_set (periodical) then
		periodical = '';							-- to be safe for concatenation
	else
		periodical = utilities.wrap_style ('italic-title', periodical);	-- style
	end

	periodical = script_concatenate (periodical, script_periodical, script_periodical_source);	-- <bdi> tags, lang attribute, categorization, etc.; must be done after title is wrapped

	if utilities.is_set (trans_periodical) then
		trans_periodical = utilities.wrap_style ('trans-italic-title', trans_periodical);
		if utilities.is_set (periodical) then
			periodical = periodical .. ' ' .. trans_periodical;
		else										-- here when trans-periodical without periodical or script-periodical
			periodical = trans_periodical;
			utilities.set_message ('err_trans_missing_title',);
		end
	end

	return periodical;
end
-- Code copied from format_periodical above.
-- We may need our own function if we want to have periodical linkable later.
local function format_plain_periodical (script_periodical, script_periodical_source, periodical, trans_periodical)
	if not utilities.is_set (periodical) then
		periodical = '';							-- to be safe for concatenation
	end

	periodical = script_concatenate (periodical, script_periodical, script_periodical_source);	-- <bdi> tags, lang attribute, categorization, etc.; must be done after title is wrapped

	if utilities.is_set (trans_periodical) then
		trans_periodical = utilities.wrap_style ('trans-quoted-title', trans_periodical);
		if utilities.is_set (periodical) then
			periodical = periodical .. ' ' .. trans_periodical;
		else										-- here when trans-periodical without periodical or script-periodical
			periodical = trans_periodical;
			utilities.set_message ('err_trans_missing_title',);
		end
	end

	return periodical;
end
--[[--------------------------< F O R M A T _ C H A P T E R _ T I T L E >--------------------------------------

Formats |script-chapter=, |chapter=, |trans-chapter=, and |chapter-url= into a single chapter meta-parameter
(chapter_url_source used for error messages).

]]
local function format_chapter_title (script_chapter, script_chapter_source, chapter, chapter_source, trans_chapter, trans_chapter_source, chapter_url, chapter_url_source, no_quotes, access)
	local ws_url, ws_label, L = wikisource_url_make (chapter);	-- make a wikisource URL and label from a wikisource interwiki link
	if ws_url then
		ws_label = ws_label:gsub ('_', ' ');					-- replace underscore separators with space characters
		chapter = ws_label;
	end

	if not utilities.is_set (chapter) then
		chapter = '';											-- to be safe for concatenation
	else
		if false == no_quotes then
			chapter = kern_quotes (chapter);					-- if necessary, separate chapter title's leading and trailing quote marks from module provided quote marks
			chapter = utilities.wrap_style ('quoted-title', chapter);
		end
	end

	chapter = script_concatenate (chapter, script_chapter, script_chapter_source);	-- <bdi> tags, lang attribute, categorization, etc.; must be done after title is wrapped

	if utilities.is_set (chapter_url) then
		chapter = external_link (chapter_url, chapter, chapter_url_source, access);	-- adds bare_url_missing_title error if appropriate
	elseif ws_url then
		chapter = external_link (ws_url, chapter .. ' ', 'ws link in chapter');		-- adds bare_url_missing_title error if appropriate; space char to move icon away from chap text; TODO: better way to do this?
		chapter = utilities.substitute (cfg.presentation['interwiki-icon'],);
	end

	if utilities.is_set (trans_chapter) then
		trans_chapter = utilities.wrap_style ('trans-quoted-title', trans_chapter);
		if utilities.is_set (chapter) then
			chapter = chapter .. ' ' .. trans_chapter;
		else													-- here when trans_chapter without chapter or script-chapter
			chapter = trans_chapter;
			chapter_source = trans_chapter_source:match ('trans%-?(.+)');	-- when no chapter, get matching name from trans-
			utilities.set_message ('err_trans_missing_title',);
		end
	end

	return chapter;
end
--[[--------------------< H A S _ I N V I S I B L E _ C H A R S >----------------------------------------------

This function searches a parameter's value for non-printable or invisible characters.  The search stops at the
first match.

This function will detect the visible replacement character when it is part of the Wikisource.

Detects but ignores nowiki and math stripmarkers.  Also detects other named stripmarkers (gallery, math, pre, ref)
and identifies them with a slightly different error message.  See also coins_cleanup.

Output of this function is an error message that identifies the character or the Unicode group, or the stripmarker
that was detected along with its position (or, for multi-byte characters, the position of its first byte) in the
parameter value.

]]
local function has_invisible_chars (param, v)
	local position = '';						-- position of invisible char or starting position of stripmarker
	local capture;								-- used by stripmarker detection to hold name of the stripmarker
	local stripmarker;							-- boolean set true when a stripmarker is found

	capture = string.match (v, '[%w%p ]*');		-- test for values that are simple ASCII text and bypass other tests if true
	if capture == v then						-- the whole value is simple ASCII text; nothing more to do
		return;
	end

	for _, invisible_char in ipairs (cfg.invisible_chars) do
		local char_name = invisible_char[1];	-- the character or group name
		local pattern = invisible_char[2];		-- the pattern used to find it
		position, _, capture = mw.ustring.find (v, pattern);	-- see if the parameter value contains characters that match the pattern

		if position and (cfg.invisible_defs.zwj == capture) then	-- if we found a zero-width joiner character
			if mw.ustring.find (v, cfg.indic_script) then			-- it's ok if one of the Indic scripts
				position = nil;										-- unset position so it is not reported as an error
			end
		end

		if position then
			if 'nowiki' == capture or 'math' == capture or			-- nowiki and math stripmarkers (not an error condition)
				('templatestyles' == capture and utilities.in_array (param,)) then	-- templatestyles stripmarker allowed in these parameters
					stripmarker = true;								-- set a flag
			elseif true == stripmarker and cfg.invisible_defs.del == capture then	-- because stripmakers begin and end with the delete char, assume that we've found one end of a stripmarker
				position = nil;										-- unset
			else
				local err_msg;
				if capture and not (cfg.invisible_defs.del == capture) then
					err_msg = capture .. ' ' .. char_name;
				else
					err_msg = char_name .. ' ' .. 'character';
				end

				utilities.set_message ('err_invisible_char',);		-- add error message
				return;												-- and done with this parameter
			end
		end
	end
end
--[[---------------------< A R G U M E N T _ W R A P P E R >---------------------------------------------------

Argument wrapper.  This function provides support for argument mapping defined in the configuration file so that
multiple names can be transparently aliased to a single internal variable.

]]
local function argument_wrapper (args)
	local origin = {};

	return setmetatable;
end
--[[--------------------------< N O W R A P _ D A T E >--------------------------------------------------------

When date is in YYYY-MM-DD format, wrap the whole date in nowrap markup; when date is in DD MMMM YYYY or
MMMM DD, YYYY format, wrap the day-and-month portion so that it doesn't break across lines.

]]
local function nowrap_date (date)
	local cap = '';
	local cap2 = '';

	if date:match("^%d%d%d%d%-%d%d%-%d%d$") then
		date = utilities.substitute (cfg.presentation['nowrap1'], date);
	elseif date:match("^%a+%s*%d%d?,%s+%d%d%d%d$") or date:match ("^%d%d?%s*%a+%s+%d%d%d%d$") then
		cap, cap2 = string.match (date, "^(.*)%s+(%d%d%d%d)$");
		date = utilities.substitute (cfg.presentation['nowrap2'], {cap, cap2});
	end
	return date;
end
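-- Illustrative behaviour (assuming cfg.presentation['nowrap1'] / ['nowrap2'] wrap their arguments in nowrap markup):
--   nowrap_date ('2024-01-05')			--> the whole date wrapped via 'nowrap1'
--   nowrap_date ('January 5, 2024')	--> 'January 5,' and '2024' substituted into 'nowrap2'
--   nowrap_date ('5 January 2024')		--> '5 January' and '2024' substituted into 'nowrap2'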
--[[--------------------------< S E T _ T I T L E T Y P E >----------------------------------------------------

This function sets default title types (equivalent to the citation including |type=<default value>) for those
templates that have defaults.  Also handles the special case where it is desirable to omit the title type from
the rendered citation (|type=none).

]]
local function set_titletype (cite_class, title_type)
	if utilities.is_set (title_type) then
		if 'none' == title_type then
			title_type = '';						-- if |type=none then type parameter not displayed
		end
		return title_type;							-- if |type= has been set to any other value use that value
	end

	return cfg.title_types [cite_class] or '';		-- set template's default title type; else empty string for concatenation
end
--[[--------------------------< S A F E _ J O I N >------------------------------------------------------------

Joins a sequence of strings together while checking for duplicate separation characters.

]]
local function safe_join (tbl, duplicate_char)
	local f = {};									-- create a function table appropriate to type of 'duplicate character'
	if 1 == #duplicate_char then					-- for single byte ASCII characters use the string library functions
		f.gsub = string.gsub
		f.match = string.match
		f.sub = string.sub
	else											-- for multi-byte characters use the ustring library functions
		f.gsub = mw.ustring.gsub
		f.match = mw.ustring.match
		f.sub = mw.ustring.sub
	end

	local str = '';									-- the output string
	local comp = '';								-- what does 'comp' mean?
	local end_chr = '';
	local trim;
	for _, value in ipairs (tbl) do
		if value == nil then value = ''; end

		if str == '' then							-- if output string is empty
			str = value;							-- assign value to it (first time through the loop)
		elseif value ~= '' then
			if value:sub(1, 1) == '<' then			-- special case of values enclosed in spans and other markup
				comp = value:gsub ("%b<>", "");		-- remove HTML markup
			else
				comp = value;
			end

			if comp:sub(1, 1) == duplicate_char then	-- is first character same as duplicate_char?
				trim = false;
				end_chr = f.sub(str, -1, -1);		-- get the last character of the output string
				if end_chr == duplicate_char then	-- if same as separator
					str = f.sub(str, 1, -2);		-- remove it
				elseif end_chr == "'" then			-- if it might be wiki-markup
					if f.sub(str, -3, -1) == duplicate_char .. "''" then		-- if last three chars of str are sepc''
						str = f.sub(str, 1, -4) .. "''";						-- remove them and add back ''
					elseif f.sub(str, -5, -1) == duplicate_char .. "]]''" then	-- if last five chars of str are sepc]]''
						trim = true;
					elseif f.sub(str, -4, -1) == duplicate_char .. "]''" then	-- if last four chars of str are sepc]''
						trim = true;				-- same question
					end
				elseif end_chr == "]" then			-- if it might be wiki-markup
					if f.sub(str, -3, -1) == duplicate_char .. "]]" then		-- if last three chars of str are sepc]] wikilink
						trim = true;
					elseif f.sub(str, -3, -1) == duplicate_char .. '"]' then	-- if last three chars of str are sepc"] quoted external link
						trim = true;
					elseif f.sub(str, -2, -1) == duplicate_char .. "]" then		-- if last two chars of str are sepc] external link
						trim = true;
					elseif f.sub(str, -4, -1) == duplicate_char .. "'']" then	-- normal case when |url=something & |title=Title
						trim = true;
					end
				elseif end_chr == " " then			-- if last char of output string is a space
					if f.sub(str, -2, -1) == duplicate_char .. " " then			-- if last two chars of str are sepc plus a space
						str = f.sub(str, 1, -3);	-- remove them both
					end
				end

				if trim then
					if value ~= comp then			-- value does not equal comp when value contains HTML markup
						local dup2 = duplicate_char;
						if f.match(dup2, "%A") then dup2 = "%" .. dup2; end		-- if duplicate_char not a letter then escape it
						value = f.gsub(value, "(%b<>)" .. dup2, "%1", 1)		-- remove duplicate_char if it follows HTML markup
					else
						value = f.sub(value, 2, -1);	-- remove duplicate_char when it is first character
					end
				end
			end
			str = str .. value;						-- add it to the output string
		end
	end
	return str;
end
--[[--------------------------< I S _ S U F F I X >------------------------------------------------------------

returns true if suffix is properly formed Jr, Sr, or ordinal in the range 1–9.  Punctuation not allowed.

]]
local function is_suffix (suffix)
	if utilities.in_array (suffix,) or suffix:match ('^%dth$') then
		return true;
	end
	return false;
end
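-- Illustrative results (assuming the elided in_array() list holds 'Jr', 'Sr', '2nd', and '3rd'):
--   is_suffix ('Jr')	--> true		is_suffix ('4th')	--> true (matches '^%dth$')
--   is_suffix ('Jr.')	--> false		is_suffix ('10th')	--> false (only single-digit ordinals)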
--[[--------------------------< I S _ G O O D _ V A N C _ N A M E >--------------------------------------------

For Vancouver style, this function allows |first= and |last= names to contain any of the letters defined in the
four Unicode Latin character sets
	C0 Controls and Basic Latin			0041–005A, 0061–007A
	C1 Controls and Latin-1 Supplement	00C0–00D6, 00D8–00F6, 00F8–00FF
	Latin Extended-A					0100–017F
	Latin Extended-B					0180–01BF, 01C4–024F

|lastn= also allowed to contain hyphens, spaces, and apostrophes. (http://www.ncbi.nlm.nih.gov/books/NBK7271/box/A35029/)
|firstn= also allowed to contain hyphens, spaces, apostrophes, and periods

This original test:
	if nil == mw.ustring.find (first, "^[A-Za-zÀ-ÖØ-öø-ƿDŽ-ɏ%-%s%'%.]+[2-6%a]*$") then
was written outside of the code editor and pasted here because the code editor gets confused between character
insertion point and cursor position.  The test has been rewritten to use decimal character escape sequences for
the individual bytes of the Unicode characters so that it is not necessary to use an external editor to maintain
this code.

	\195\128-\195\150 – À-Ö (U+00C0–U+00D6 – C0 controls)
	\195\152-\195\182 – Ø-ö (U+00D8-U+00F6 – C0 controls)
	\195\184-\198\191 – ø-ƿ (U+00F8-U+01BF – C0 controls, Latin extended A & B)
	\199\132-\201\143 – DŽ-ɏ (U+01C4-U+024F – Latin extended B)

]]
local function is_good_vanc_name (last, first, suffix, position)
	if not suffix then
		if first:find ('[,%s]') then				-- when there is a space or comma, might be first name/initials + generational suffix
			first = first:match ('(.-)[,%s]+');		-- get name/initials
			suffix = first:match ('[,%s]+(.+)$');	-- get generational suffix
		end
	end
	if utilities.is_set (suffix) then
		if not is_suffix (suffix) then
			add_vanc_error (cfg.err_msg_supl.suffix, position);
			return false;							-- not a name with an appropriate suffix
		end
	end
	if nil == mw.ustring.find (first, "^[A-Za-z\195\128-\195\150\195\152-\195\182\195\184-\198\191\199\132-\201\143%-%s%'%.]*$") then
		add_vanc_error (cfg.err_msg_supl['non-Latin char'], position);
		return false;								-- not a string of Latin characters; Vancouver requires Romanization
	end;
	return true;
end
--[[--------------------------< R E D U C E _ T O _ I N I T I A L S >------------------------------------------

Attempts to convert names to initials in support of |name-list-style=vanc.

Names in |firstn= may be separated by spaces or hyphens, or for initials, a period.
See http://www.ncbi.nlm.nih.gov/books/NBK7271/box/A35062/.

Vancouver style requires family rank designations (Jr, II, III, etc.) to be rendered as Jr, 2nd, 3rd, etc.
See http://www.ncbi.nlm.nih.gov/books/NBK7271/box/A35085/.  This code only accepts and understands generational
suffix in the Vancouver format because Roman numerals look like, and can be mistaken for, initials.

This function uses ustring functions because firstname initials may be any of the Unicode Latin characters
accepted by is_good_vanc_name().

]]
local function reduce_to_initials (first, position)
	local name, suffix = mw.ustring.match(first, "^(%u+) ([%dJS][%drndth]+)$");

	if not name then							-- if not initials and a suffix
		name = mw.ustring.match(first, "^(%u+)$");	-- is it just initials?
	end

	if name then								-- if first is initials with or without suffix
		if 3 > mw.ustring.len (name) then		-- if one or two initials
			if suffix then						-- if there is a suffix
				if is_suffix (suffix) then		-- is it legitimate?
					return first;				-- one or two initials and a valid suffix so nothing to do
				else
					add_vanc_error (cfg.err_msg_supl.suffix, position);	-- one or two initials with invalid suffix so error message
					return first;				-- and return first unmolested
				end
			else
				return first;					-- one or two initials without suffix; nothing to do
			end
		end
	end											-- if here then name has 3 or more uppercase letters so treat them as a word

	local initials, names = {}, {};				-- tables to hold name parts and initials
	local i = 1;								-- counter for number of initials

	names = mw.text.split (first, '[%s,]+');	-- split into a table of names and possible suffix

	while names[i] do							-- loop through the table
		if 1 < i and names[i]:match ('[%dJS][%drndth]+%.?$') then	-- if not the first name, and looks like a suffix (may have trailing dot)
			names[i] = names[i]:gsub ('%.', '');	-- remove terminal dot if present
			if is_suffix (names[i]) then			-- if a legitimate suffix
				table.insert (initials, ' ' .. names[i]);	-- add a separator space, insert at end of initials table
				break;							-- and done because suffix must fall at the end of a name
			end									-- no error message if not a suffix; possibly because of Romanization
		end
		if 3 > i then
			table.insert (initials, mw.ustring.sub(names[i], 1, 1));	-- insert the initial at end of initials table
		end
		i = i + 1;								-- bump the counter
	end

	return table.concat(initials)				-- Vancouver format does not include spaces.
end
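-- Illustrative behaviour (suffix handling assumes is_suffix() accepts '2nd'):
--   reduce_to_initials ('AB', 1)			--> 'AB' (already one or two initials)
--   reduce_to_initials ('AB 2nd', 1)		--> 'AB 2nd' (initials plus a valid generational suffix)
--   reduce_to_initials ('Anthony B', 1)	--> 'AB' (first letter of each name part)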
--[[--------------------------< L I S T _ P E O P L E >--------------------------------------------------------

Formats a list of people (e.g. authors / editors); a name in the list may be masked or linked when its matching
mask or link parameter has a value.

]]
local function list_people (control, people, etal)
	local sep;
	local namesep;
	local format = control.format;
	local maximum = control.maximum;
	local name_list = {};

	if 'vanc'

	if utilities.is_set (maximum) and i > maximum then
		etal = true;
		break;
	end
	if mask then
		local n = tonumber (mask);						-- convert to a number if it can be converted; nil else
		if n then
			one = 0 ~= n and string.rep(" - ", n) or nil;	-- make a string of (n > 0) mdashes, nil else, to replace name
			person.link = nil;							-- don't create link to name if name is replaced with mdash string or has been set nil
		else
			one = mask;									-- replace name with mask text (must include name-list separator)
			sep_one = " ";								-- modify name-list separator
		end
	else
		one = person.last;								-- get surname
		local first = person.first						-- get given name
		if utilities.is_set (first) then
			if ("vanc"

	local count = #name_list / 2;						-- (number of names + number of separators) divided by 2
	if 0 < count then
		if 1 < count and not etal then
			if 'amp' == format then
				if 2

	local result = table.concat (name_list);			-- construct list
	if etal and utilities.is_set (result) then			-- etal may be set by |display-authors=etal but we might not have a last-first list
		result = result .. sep .. ' ' .. cfg.messages['et al'];	-- we've got a last-first list and etal so add et al.
	end
	return result, count;								-- return name-list string and count of number of names (count used for editor names only)
end
--[[----------------------< M A K E _ C I T E R E F _ I D >----------------------------------------------------

Generates a CITEREF anchor ID if we have at least one name or a date.  Otherwise returns an empty string.

namelist is one of the contributor-, author-, or editor-name lists chosen in that order.  year is Year or
anchor_year.

]]
local function make_citeref_id (namelist, year)
	local names = {};						-- a table for the one to four names and year
	for i, v in ipairs (namelist) do		-- loop through the list and take up to the first four last names
		names[i] = v.last
		if i == 4 then break end			-- if four then done
	end
	table.insert (names, year);				-- add the year at the end
	local id = table.concat (names);		-- concatenate names and year for CITEREF id
	if utilities.is_set (id) then			-- if concatenation is not an empty string
		return "CITEREF" .. id;				-- add the CITEREF portion
	else
		return '';							-- return an empty string; no reason to include CITEREF id in this citation
	end
end
--[[--------------------------< C I T E _ C L A S S _ A T T R I B U T E _ M A K E >----------------------------

construct the <cite> tag class attribute string from the template's CitationClass and the |mode= parameter

]]
local function cite_class_attribute_make (cite_class, mode)
	local class_t = {};
	table.insert (class_t, 'citation');				-- required for blue highlight
	if 'citation' ~= cite_class then
		table.insert (class_t, cite_class);			-- identify this template for user css
		table.insert (class_t, utilities.is_set (mode) and mode or 'cs1');	-- identify the citation style for user css or javascript
	else
		table.insert (class_t, utilities.is_set (mode) and mode or 'cs2');	-- identify the citation style for user css or javascript
	end
	for _, prop_key in ipairs (z.prop_keys_t) do
		table.insert (class_t, prop_key);			-- identify various properties for user css or javascript
	end

	return table.concat (class_t, ' ');				-- make a big string and done
end
--[[--------------------------< N A M E _ H A S _ E T A L >----------------------------------------------------

Evaluates the content of name parameters (author, editor, etc.) for variations on the theme of et al.  If found,
the et al. text is removed and a flag is set to true.  The flag is never reset here because it may have been set
by a previous pass through this function or by the associated |display-<names>=etal parameter.

]]
local function name_has_etal (name, etal, nocat, param)

	if utilities.is_set (name) then						-- name can be nil in which case just return
		local patterns = cfg.et_al_patterns;			-- get patterns from configuration

		for _, pattern in ipairs (patterns) do			-- loop through all of the patterns
			if name:match (pattern) then				-- if this 'et al' pattern is found in name
				name = name:gsub (pattern, '');			-- remove the offending text
				etal = true;							-- set flag (may have been set previously here or by |display-<names>=etal)
			end
		end
	end

	return name, etal;
end
--[[--------------------------< N A M E _ I S _ N U M E R I C >------------------------------------------------

Add a maintenance category when a name parameter value does not contain any letters; |last=A. Green (1922-1987)
does not get caught in the current version of this test but |first=(1888) is caught.

returns nothing

]]
local function name_is_numeric (name, list_name)
	if utilities.is_set (name) then
		if mw.ustring.match (name, '^[%A]+$') then		-- when name does not contain any letters
			utilities.set_message ('maint_numeric_names', cfg.special_case_translation [list_name]);	-- add a maint cat for this template
		end
	end
end
--[[-------------------< N A M E _ H A S _ M U L T _ N A M E S >-----------------------------------------------

Evaluates the content of last/surname (authors etc.) parameters for multiple names.  Multiple names are indicated
if there is more than one comma or any "unescaped" semicolons.  Escaped semicolons are ones used as part of
selected HTML entities.  If the condition is met, the function adds the multiple name maintenance category.

returns nothing

]]
local function name_has_mult_names (name, list_name)
	local _, commas, semicolons, nbsps;
	if utilities.is_set (name) then
		_, commas = name:gsub (',', '');				-- count the number of commas
		_, semicolons = name:gsub (';', '');			-- count the number of semicolons
		-- nbsps probably should be its own separate count rather than merged in some way with semicolons
		-- because Lua patterns do not support the grouping operator that regex does, which means there is
		-- no way to add more entities to escape except by adding more counts with the new entities
		_, nbsps = name:gsub ('&nbsp;', '');			-- count nbsps

		-- There is exactly 1 semicolon per &nbsp; entity, so subtract nbsps from semicolons to 'escape' them.
		-- If additional entities are added, they also can be subtracted.
		if 1 < commas or 0 < (semicolons - nbsps) then
			utilities.set_message ('maint_mult_names', cfg.special_case_translation [list_name]);	-- add a maint message
		end
	end
end
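-- Illustrative behaviour (assuming the counts work as described above):
--   'Smith, John'					--> not flagged (one comma)
--   'Smith, John, Jones, Bob'		--> flagged (more than one comma)
--   'Smith, John; Jones, Bob'		--> flagged (unescaped semicolon)
--   'Black&nbsp;White'				--> not flagged (the only semicolon belongs to the &nbsp; entity)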
--[=[-------------------------< I S _ G E N E R I C >----------------------------------------------------------

Compares values assigned to various parameters according to the string provided as <item> in the function call.
<item> can have one of two values:
	'generic_names' – for name-holding parameters: |last=, |first=, |editor-last=, etc.
	'generic_titles' – for |title=

There are two types of generic tests.  The 'accept' tests look for a pattern that should not be rejected by the
'reject' test.  For example, |author=[[John Smith (author)|Smith, John]] would be rejected by the 'author' reject
test.  But piped wikilinks with 'author' disambiguation should not be rejected so the 'accept' test prevents that
from happening.  Accept tests are always performed before reject tests.

Each of the 'accept' and 'reject' sequence tables hold tables for en.wiki (['en']) and local.wiki (['local'])
that each can hold a test sequence table.  The sequence table holds, at index [1], a test pattern, and, at index
[2], a boolean control value.  The control value tells string.find or mw.ustring.find to do plain-text search
(true) or a pattern search (false).  The intent of all this complexity is to make these searches as fast as
possible so that we don't run out of processing time on very large articles.

Returns
	true when a reject test finds the pattern or string
	false when an accept test finds the pattern or string
	nil else

]=]
local function is_generic (item, value, wiki)
	local test_val;
	local str_lower = 
	local str_find = 

	local function test (val, test_t, wiki)				-- local function to do the testing;

	for _, test_type in ipairs (test_types_t) do		-- for each test type
		for _, generic_value in pairs (cfg.special_case_translation[item][test_type]) do	-- spin through the list of generic value fragments to accept or reject
			for _, wiki in ipairs (wikis_t) do
				if generic_value[wiki] then
					if test (value, generic_value[wiki], wiki) then		-- go do the test
						return ('reject'
--[[--------------------------< N A M E _ I S _ G E N E R I C >------------------------------------------------

calls is_generic() to determine if the name is a generic name; if it is, sets an error message (only one such
error per citation).

]]
local function name_is_generic (name, name_alias)
	if not added_generic_name_errs and is_generic ('generic_names', name) then
		utilities.set_message ('err_generic_name', name_alias);	-- set an error message
		added_generic_name_errs = true;
	end
end
--[[--------------------------< N A M E _ C H E C K S >--------------------------------------------------------

This function calls various name checking functions used to validate the content of the various name-holding
parameters.

]]
local function name_checks (last, first, list_name, last_alias, first_alias)
	local accept_name;

	if utilities.is_set (last) then
		last, accept_name = utilities.has_accept_as_written (last);	-- remove accept-this-as-written markup when it wraps all of last

		if not accept_name then											-- last not wrapped in accept-as-written markup
			name_has_mult_names (last, list_name);						-- check for multiple names in the parameter
			name_is_numeric (last, list_name);							-- check for names that are only digits and punctuation
			name_is_generic (last, last_alias);							-- check for names found in the generic names list
		end
	end

	if utilities.is_set (first) then
		first, accept_name = utilities.has_accept_as_written (first);	-- remove accept-this-as-written markup when it wraps all of first

		if not accept_name then											-- first not wrapped in accept-as-written markup
			name_is_numeric (first, list_name);							-- check for names that are only digits and punctuation
			name_is_generic (first, first_alias);						-- check for names found in the generic names list
		end
	end

	return last, first;													-- done
end
--[[--------------------------< E X T R A C T _ N A M E S >----------------------------------------------------

Searches through args in sequential order to find |lastn= and |firstn= parameters (or their aliases), and their
matching link and mask parameters.  Stops searching when both |lastn= and |firstn= are not found in args after
two sequential attempts: found |last1=, |last2=, and |last3= but doesn't find |last4= and |last5=, then the
search is done.

This function emits an error message when there is a |firstn= without a matching |lastn=.  When there are 'holes'
in the list of last names, |last1= and |last3= are present but |last2= is missing, an error message is emitted.
|lastn= is not required to have a matching |firstn=.

When an author or editor parameter contains some form of 'et al.', the 'et al.' is stripped from the parameter
and a flag (etal) returned that will cause list_people() to add the static 'et al.' text from
Module:Citation/CS1/Configuration.  This keeps 'et al.' out of the template's metadata.  When this occurs, an
error is emitted.

]]
local function extract_names (args, list_name)
	local names = {};			-- table of names
	local last;					-- individual name components
	local first;
	local link;
	local mask;
	local i = 1;				-- loop counter/indexer
	local n = 1;				-- output table indexer
	local count = 0;			-- used to count the number of times we haven't found a |last= (or alias for authors, |editor-last or alias for editors)
	local etal = false;			-- return value set to true when we find some form of et al. in an author parameter

	local last_alias, first_alias, link_alias;		-- selected parameter aliases used in error messaging
	while true do
		last, last_alias = utilities.select_one (args, cfg.aliases[list_name .. '-Last'], 'err_redundant_parameters', i);	-- search through args for name components beginning at 1
		first, first_alias = utilities.select_one (args, cfg.aliases[list_name .. '-First'], 'err_redundant_parameters', i);
		link, link_alias = utilities.select_one (args, cfg.aliases[list_name .. '-Link'], 'err_redundant_parameters', i);
		mask = utilities.select_one (args, cfg.aliases[list_name .. '-Mask'], 'err_redundant_parameters', i);

		last, etal = name_has_etal (last, etal, false, last_alias);		-- find and remove variations on et al.
		first, etal = name_has_etal (first, etal, false, first_alias);	-- find and remove variations on et al.
		last, first = name_checks (last, first, list_name, last_alias, first_alias);	-- multiple names, extraneous annotation, etc. checks

		if first and not last then										-- if there is a firstn without a matching lastn
			local alias = first_alias:find ('given', 1, true) and 'given' or 'first';	-- get first or given form of the alias
			utilities.set_message ('err_first_missing_last',);			-- add this error message
		elseif not first and not last then								-- if both firstn and lastn aren't found, are we done?
			count = count + 1;											-- number of times we haven't found last and first
			if 2 <= count then											-- two missing names and we give up
				break;													-- normal exit or there is a two-name hole in the list; can't tell which
			end
		else															-- we have last with or without a first
			local result;
			link = link_title_ok (link, link_alias, last, last_alias);	-- check for improper wiki-markup

			if first then
				link = link_title_ok (link, link_alias, first, first_alias);	-- check for improper wiki-markup
			end

			names[n] = ;												-- add this name to our names list (corporate for |vauthors= only)
			n = n + 1;													-- point to next location in the names table
			if 1 == count then											-- if the previous name was missing
				utilities.set_message ('err_missing_name',);			-- add an error message
			end
			count = 0;													-- reset the counter; we're looking for two consecutive missing names
		end
		i = i + 1;														-- point to next args location
	end

	return names, etal;													-- all done, return our list of names and the etal flag
end
--[[--------------------------< N A M E _ T A G _ G E T >------------------------------------------------------

Attempts to get a language name / language tag pair from the value assigned to |language=.  This function looks
for the value as a language tag and as a language name.

on success, returns name (in properly capitalized form) and matching tag (in lowercase); on failure returns nil

]]
local function name_tag_get (lang_param)
	local lang_param_lc = mw.ustring.lower (lang_param);	-- use lowercase as an index into the various tables
	local name;
	local tag;

	name = cfg.lang_code_remap[lang_param_lc];				-- assume

	tag = lang_param_lc:match ('^(%a%a%a?)%-.*');			-- still assuming that

	if cfg.lang_name_remap[lang_param_lc] then				-- not a tag, assume

	tag = cfg.mw_languages_by_name_t[lang_param_lc];		-- assume that

	name = cfg.mw_languages_by_tag_t[lang_param_lc];		-- assume that

	if tag then
		name = cfg.mw_languages_by_tag_t[tag];				-- attempt to get a language name using the shortened
--[[--------------------------< L A N G U A G E _ P A R A M E T E R >------------------------------------------

When |language= contains a recognized language (either code or name), the page is assigned to the category for
that code: Category:Norwegian-language sources (no).  For valid three-character code languages, the page is
assigned to the single category for '639-2' codes: Category:CS1 ISO 639-2 language sources.

Languages that are the same as the local wiki are not categorized.  MediaWiki does not recognize three-character
equivalents of two-character codes: code 'ar' is recognized but code 'ara' is not.

This function supports multiple languages in the form |language=nb, French, th where the language names or codes
are separated from each other by commas with optional space characters.

]]
local function language_parameter (lang)
	local tag;							-- some form of IETF-like language tag; language subtag with optional region, script, variant, etc. subtags
	local lang_subtag;					-- ve populates |language= with mostly unnecessary region subtags that MediaWiki does not recognize; this is the base language subtag
	local name;							-- the language name
	local language_list = {};			-- table of language names to be rendered
	local names_t = {};					-- table made from the value assigned to |language=

	local this_wiki_name = mw.language.fetchLanguageName (cfg.this_wiki_code, cfg.this_wiki_code);	-- get this wiki's language name

	names_t = mw.text.split (lang, '%s*,%s*');		-- names should be a comma separated list

	for _, lang in ipairs (names_t) do				-- reuse lang here because we don't yet know if lang is a language name or a language tag
		name, tag = name_tag_get (lang);			-- attempt to get name/tag pair for

		if utilities.is_set (tag) then
			lang_subtag = tag:gsub ('^(%a%a%a?)%-.*', '%1');	-- for categorization, strip any IETF-like tags from language tag

			if cfg.this_wiki_code ~= lang_subtag then			-- when the language is not the same as this wiki's language
				if 2
					#language_list) and (lang_subtag
--[[-------------------------< S E T _ C S _ S T Y L E >-------------------------------------------------------

Gets the default CS style configuration for the given mode.
Returns default separator and either postscript as passed in or the default.
In CS1, the default postscript and separator are '.'.
In CS2, the default postscript is the empty string and the default separator is ','.

]]
local function set_cs_style (postscript, mode)
	if utilities.is_set (postscript) then
		-- emit a maintenance message if user postscript is the default cs1 postscript
		-- we catch the opposite case for cs2 in set_style
		if mode == 'cs1' and postscript == cfg.presentation['ps_' .. mode] then
			utilities.set_message ('maint_postscript');
		end
	else
		postscript = cfg.presentation['ps_' .. mode];
	end
	return cfg.presentation['sep_' .. mode], postscript;
end
--[[--------------------------< S E T _ S T Y L E >------------------------------------------------------------

Sets the separator and postscript styles from |mode= first and the template's CitationClass second.

]]
local function set_style (mode, postscript, cite_class)
	local sep;
	if 'cs2' == mode then									-- if this template is to be rendered in CS2 (citation) style
		sep, postscript = set_cs_style (postscript, 'cs2');
	elseif 'cs1' == mode then								-- if this template is to be rendered in CS1 (cite xxx) style
		sep, postscript = set_cs_style (postscript, 'cs1');
	elseif 'citation' == cite_class then					-- for citation templates (CS2)
		sep, postscript = set_cs_style (postscript, 'cs2');
	else													-- for cite templates (CS1)
		sep, postscript = set_cs_style (postscript, 'cs1');
	end

	if cfg.keywords_xlate[postscript:lower()] == 'none' then	-- if |postscript=none
		-- emit a maintenance message if user postscript is the default cs2 postscript;
		-- we catch the opposite case for cs1 in set_cs_style
		if 'cs2' == mode or 'citation' == cite_class then
			utilities.set_message ('maint_postscript');
		end
		postscript = '';									-- no postscript
	end

	return sep, postscript;
end
--[=[-------------------------< I S _ P D F >------------------------------------------------------------------

Determines if a URL has a file extension that is one of the PDF file extensions used by [[MediaWiki:Common.css]]
when applying the PDF icon to external links.

returns true if file extension is one of the recognized extensions, else false

]=]
local function is_pdf (url)
	return url:match ('%.pdf$') or url:match ('%.PDF$') or
		url:match ('%.pdf[%?#]') or url:match ('%.PDF[%?#]') or
		url:match ('%.PDF#') or url:match ('%.pdf#');
end
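-- Illustrative results:
--   is_pdf ('https://example.com/paper.pdf')		--> truthy
--   is_pdf ('https://example.com/paper.PDF?x=1')	--> truthy
--   is_pdf ('https://example.com/paper.pdfx')		--> nil (falsy)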
--[[--------------------------< S T Y L E _ F O R M A T >------------------------------------------------------

Applies CSS styling to |format=, |chapter-format=, etc.  Also emits an error message if the format parameter does
not have a matching URL parameter.  If the format parameter is not set and the URL contains a file extension that
is recognized as a PDF document by MediaWiki's commons.css, this code will set the format parameter to (PDF) with
the appropriate styling.

]]
local function style_format (format, url, fmt_param, url_param)
	if utilities.is_set (format) then
		format = utilities.wrap_style ('format', format);		-- add leading space, parentheses, resize
		if not utilities.is_set (url) then
			utilities.set_message ('err_format_missing_url',);	-- add an error message
		end
	elseif is_pdf (url) then									-- format is not set so if URL is a PDF file then
		format = utilities.wrap_style ('format', 'PDF');		-- set format to PDF
	else
		format = '';											-- empty string for concatenation
	end
	return format;
end
--[[--------------------------< G E T _ D I S P L A Y _ N A M E S >--------------------------------------------

When the value assigned to |display-xxxxors= is a number greater than or equal to zero, return the number and
the previous state of the 'etal' flag (false by default but may have been set to true if the name list contains
some variant of the text 'et al.').

When the value assigned to |display-xxxxors= is the keyword 'etal', return a number that is one greater than the
number of authors in the list and set the 'etal' flag true.  This will cause list_people() to display all of the
names in the name list followed by 'et al.'

In all other cases, returns nil and the previous state of the 'etal' flag.

inputs:
	max: A['DisplayAuthors'] or A['DisplayEditors']; a number or some flavor of etal
	count: #a or #e
	list_name: 'authors' or 'editors'
	etal: author_etal or editor_etal

]]
local function get_display_names (max, count, list_name, etal, param)
	if utilities.is_set (max) then
		if 'etal' == max:lower():gsub("[ '%.]", '') then	-- the :gsub() portion makes 'etal' from a variety of 'et al.' spellings and stylings
			max = count + 1;								-- number of authors + 1 so display all author names plus et al.
			etal = true;									-- overrides value set by extract_names()
		elseif max:match ('^%d+$') then						-- if a string of numbers
			max = tonumber (max);							-- make it a number
			if max >= count then							-- if |display-xxxxors= value greater than or equal to number of authors/editors
				utilities.set_message ('err_disp_name',);	-- add error message
				max = nil;
			end
		else												-- not a valid keyword or number
			utilities.set_message ('err_disp_name',);		-- add error message
			max = nil;										-- unset; as if |display-xxxxors= had not been set
		end
	end

	return max, etal;
end
--[[--------------------------< E X T R A _ T E X T _ I N _ P A G E _ C H E C K >------------------------------

Adds an error message when |page=, |pages=, |quote-page=, |quote-pages= has what appears to be some form of p.
or pp. abbreviation in the first characters of the parameter content.

check page for extraneous p, p., pp, pp., pg, pg. at start of parameter value:
	good pattern: '^P[^%.P%l]' matches when page begins PX or P# but not Px where x and X are letters and # is a digit
	bad pattern: '^[Pp][PpGg]' matches when page begins pp, pP, Pp, PP, pg, pG, Pg, PG

]]
local function extra_text_in_page_check (val, name)
	if not val:match (cfg.vol_iss_pg_patterns.good_ppattern) then
		for _, pattern in ipairs (cfg.vol_iss_pg_patterns.bad_ppatterns) do	-- spin through the selected sequence table of patterns
			if val:match (pattern) then										-- when a match, error so
				utilities.set_message ('err_extra_text_pages', name);		-- add error message
				return;														-- and done
			end
		end
	end
end
--[[--------------------------< E X T R A _ T E X T _ I N _ V O L _ I S S _ C H E C K >------------------------

Adds an error message when |volume= or |issue= has what appears to be some form of redundant 'type' indicator.

For |volume=:
	'V.', or 'Vol.' (with or without the dot) abbreviations or 'Volume' in the first characters of the parameter
	content (all case insensitive).  'V' and 'v' (without the dot) are presumed to be roman numerals so are allowed.

For |issue=:
	'No.', 'I.', 'Iss.' (with or without the dot) abbreviations, or 'Issue' in the first characters of the
	parameter content (all case insensitive).

Single character values ('v', 'i', 'n') allowed when not followed by separator character ('.', ':', '=', or
whitespace character) – param values are trimmed of whitespace by MediaWiki before delivered to the module.

sets error message on failure; returns nothing

]]
local function extra_text_in_vol_iss_check (val, name, selector)
	if not utilities.is_set (val) then
		return;
	end

	local patterns = 'v' == selector and cfg.vol_iss_pg_patterns.vpatterns or cfg.vol_iss_pg_patterns.ipatterns;

	local handler = 'v' == selector and 'err_extra_text_volume' or 'err_extra_text_issue';

	val = val:lower();								-- force parameter value to lower case
	for _, pattern in ipairs (patterns) do			-- spin through the selected sequence table of patterns
		if val:match (pattern) then					-- when a match, error so
			utilities.set_message (handler, name);	-- add error message
			return;									-- and done
		end
	end
end
--[=[-------------------------< G E T _ V _ N A M E _ T A B L E >----------------------------------------------

split apart a |vauthors= or |veditors= parameter.  This function allows for corporate names, wrapped in doubled
parentheses, to also have commas; in the old version of the code, the doubled parentheses were included in the
rendered citation and in the metadata.  Individual author names may be wikilinked:

	|vauthors=Jones AB, [[E. B. White|White EB]], ((Black, Brown, and Co.))

]=]
local function get_v_name_table (vparam, output_table, output_link_table)
	local name_table = mw.text.split(vparam, "%s*,%s*");	-- names are separated by commas
	local wl_type, label, link;								-- wl_type not used here; just a placeholder

	local i = 1;

	while name_table[i] do
		if name_table[i]:match ('^%(%(.*[^%)][^%)]$') then	-- first segment of corporate with one or more commas; this segment has the opening doubled parentheses
			local name = name_table[i];
			i = i + 1;										-- bump indexer to next segment
			while name_table[i] do
				name = name .. ', ' .. name_table[i];		-- concatenate with previous segments
				if name_table[i]:match ('^.*%)%)$') then	-- if this table member has the closing doubled parentheses
					break;									-- and done reassembling so
				end
				i = i + 1;									-- bump indexer
			end
			table.insert (output_table, name);				-- and add corporate name to the output table
			table.insert (output_link_table, '');			-- no wikilink
		else
			wl_type, label, link = utilities.is_wikilink (name_table[i]);	-- wl_type is: 0, no wl (text in label variable); 1, [[D]]; 2, [[L|D]]
			table.insert (output_table, label);				-- add this name
			if 1 == wl_type then
				table.insert (output_link_table, label);	-- simple wikilink: the displayed text is also the target
			else
				table.insert (output_link_table, link);		-- [[L|D]] wikilink or plain text (link is nil)
			end
		end
		i = i + 1;
	end
	return output_table;
end
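-- Illustrative split, taken from the example in the comment block above:
--   |vauthors=Jones AB, [[E. B. White|White EB]], ((Black, Brown, and Co.))
-- produces three name segments: 'Jones AB', 'White EB' (with its wikilink target recorded in the
-- companion link table), and '((Black, Brown, and Co.))' kept whole despite its embedded commas.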
--[[--------------------------< P A R S E _ V A U T H O R S _ V E D I T O R S >--------------------------------

This function extracts author / editor names from |vauthors= or |veditors= and finds matching |xxxxor-maskn= and
|xxxxor-linkn= in args.  It then returns a table of assembled names just as extract_names() does.

Author / editor names in |vauthors= or |veditors= must be in Vancouver system style.  Corporate or institutional
names may sometimes be required and because such names will often fail the is_good_vanc_name() and other format
compliance tests, are wrapped in doubled parentheses ((corporate name)) to suppress the format tests.

Supports generational suffixes Jr, 2nd, 3rd, 4th–6th.

This function sets the Vancouver error when a required comma is missing and when there is a space between an
author's initials.

]]
local function parse_vauthors_veditors (args, vparam, list_name)
	local names = {};				-- table of names assembled from |vauthors=, |author-maskn=, |author-linkn=
	local v_name_table = {};
	local v_link_table = {};		-- when name is wikilinked, targets go in this table
	local etal = false;				-- return value set to true when we find some form of et al. in the vauthors parameter
	local last, first, link, mask, suffix;
	local corporate = false;

	vparam, etal = name_has_etal (vparam, etal, true);			-- find and remove variations on et al. do not categorize (do it here because et al. might have a period)
	v_name_table = get_v_name_table (vparam, v_name_table, v_link_table);	-- names are separated by commas

	for i, v_name in ipairs (v_name_table) do
		first = '';					-- set to empty string for concatenation and because it may have been set for previous author/editor
		local accept_name;
		v_name, accept_name = utilities.has_accept_as_written (v_name);	-- remove accept-this-as-written markup when it wraps all of the name

		if accept_name then
			last = v_name;
			corporate = true;		-- flag used in list_people
		elseif string.find(v_name, "%s") then
			if v_name:find('[;%.]') then						-- look for commonly occurring punctuation characters;
				add_vanc_error (cfg.err_msg_supl.punctuation, i);
			end
			local lastfirstTable = {}
			lastfirstTable = mw.text.split(v_name, "%s+")
			first = table.remove(lastfirstTable);				-- removes and returns value of last element in table which should be initials or generational suffix

			if not mw.ustring.match (first, '^%u+$') then		-- mw.ustring here so that later we will catch non-Latin characters
				suffix = first;									-- not initials so assume that whatever we got is a generational suffix
				first = table.remove(lastfirstTable);			-- get what should be the initials from the table
			end
			last = table.concat(lastfirstTable, ' ')			-- returns a string that is the concatenation of all other names that are not initials and generational suffix
			if not utilities.is_set (last) then
				first = '';										-- unset
				last = v_name;									-- last empty because something wrong with first
				add_vanc_error (cfg.err_msg_supl.name, i);
			end
			if mw.ustring.match (last, '%a+%s+%u+%s+%a+') then
				add_vanc_error (cfg.err_msg_supl['missing comma'], i);	-- matches last II last; the case when a comma is missing
			end
			if mw.ustring.match (v_name, ' %u %u$') then		-- this test is in the wrong place TODO: move or replace with a more appropriate test
				add_vanc_error (cfg.err_msg_supl.initials, i);	-- matches a space between two initials
			end
		else
			last = v_name;										-- last name or single corporate name?  Doesn't support multiword corporate names?  do we need this?
		end

		if utilities.is_set (first) then
			if not mw.ustring.match (first, "^%u?%u$") then		-- first shall contain one or two upper-case letters, nothing else
				add_vanc_error (cfg.err_msg_supl.initials, i);	-- too many initials; mixed case initials (which may be ok Romanization); hyphenated initials
			end
			is_good_vanc_name (last, first, suffix, i);			-- check first and last before restoring the suffix which may have a non-Latin digit
			if utilities.is_set (suffix) then
				first = first .. ' ' .. suffix;					-- if there was a suffix concatenate with the initials
				suffix = '';									-- unset so we don't add this suffix to all subsequent names
			end
		else
			if not corporate then
				is_good_vanc_name (last, '', nil, i);
			end
		end

		link = utilities.select_one (args, cfg.aliases[list_name .. '-Link'], 'err_redundant_parameters', i) or v_link_table[i];
		mask = utilities.select_one (args, cfg.aliases[list_name .. '-Mask'], 'err_redundant_parameters', i);
		names[i] = ;											-- add this assembled name to our names list
	end
	return names, etal;											-- all done, return our list of names
end
----------------------------< S E L E C T _ A U T H O R _ E D I T O R _ S O U R C E >------------------------

Selects one of |authors=, |authorn= / |lastn= / |firstn=, or |vauthors= as the source of the author name list, or
selects one of |editorn= / |editor-lastn= / |editor-firstn= or |veditors= as the source of the editor name list.

Only one of these three will be used. The hierarchy is: |authorn= (and aliases) highest and |authors= lowest;
|editorn= (and aliases) highest and |veditors= lowest (support for |editors= withdrawn).

When looking for |authorn= / |editorn= parameters, test |xxxxor1= and |xxxxor2= (and all of their aliases); stop after the second
test, which mimics the test used in extract_names when looking for a hole in the author name list. There may be a better
way to do this, I just haven't discovered what that way is.
Emits an error message when more than one xxxxor name source is provided.
In this function, vxxxxors = vauthors or veditors; xxxxors = authors as appropriate.
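--[=[ Illustrative sketch (not part of the module): the intent of the selection hierarchy, shown as parameter
combinations (template name omitted; only the parameters matter here):

	|vauthors=Smith JD					-->  Vancouver list selected; |name-list-style= is forced to 'vanc'
	|last1=Smith |first1=J. D.			-->  |lastn= / |firstn= list selected
	|last1=Smith |vauthors=Smith JD		-->  more than one source: a redundant-parameters error is emitted and
											 the higher-priority |lastn= / |firstn= list is used
	|authors=Smith, J. D.; Jones, A.	-->  lowest priority; also adds a maintenance category (see below)
]=]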
local function select_author_editor_source (vxxxxors, xxxxors, args, list_name)
	local lastfirst = false;
	if utilities.select_one (args, cfg.aliases[list_name .. '-Last'], 'none', 1) or		-- do this twice in case we have a |first1= without a |last1=; this ...
		utilities.select_one (args, cfg.aliases[list_name .. '-First'], 'none', 1) or	-- ... also catches the case where |first= is used with |vauthors=
		utilities.select_one (args, cfg.aliases[list_name .. '-Last'], 'none', 2) or
		utilities.select_one (args, cfg.aliases[list_name .. '-First'], 'none', 2) then
			lastfirst = true;
	end
if (utilities.is_set (vxxxxors) and true
lastfirst and utilities.is_set (xxxxors)) then local err_name; if 'AuthorList'
if true
----------------------------< I S _ V A L I D _ P A R A M E T E R _ V A L U E >------------------------------

Validates a parameter's assigned value against a list of possible (allowed) values. When the parameter is empty
or omitted, returns ret_val.
local function is_valid_parameter_value (value, name, possible, ret_val, invert) if not utilities.is_set (value) then return ret_val; -- an empty parameter is ok end
if (not invert and utilities.in_array (value, possible)) then -- normal;
----------------------------< T E R M I N A T E _ N A M E _ L I S T >----------------------------------------
This function terminates a name list (author, contributor, editor) with a separator character (sepc) and a space
when the last character is not a sepc character or when the last three characters are not sepc followed by two
closing square brackets (close of a wikilink). When either of these is true, the name_list is terminated with a
single space character.
local function terminate_name_list (name_list, sepc)
	if (string.sub (name_list, -3, -1) == sepc .. '. ') then	-- if already properly terminated
		return name_list;										-- just return the name list
	elseif (string.sub (name_list, -1, -1) == sepc) or (string.sub (name_list, -3, -1) == sepc .. ']]') then	-- if last name in list ends with sepc char
		return name_list .. " ";								-- don't add another
	else
		return name_list .. sepc .. ' ';						-- otherwise terminate the name list
	end
end
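--[=[ Illustrative sketch (not part of the module), assuming the separator character sepc is '.':

	terminate_name_list ('Black, A; White, B', '.')		-->  'Black, A; White, B. '		(sepc and a space appended)
	terminate_name_list ('Black, A; White, B.', '.')	-->  'Black, A; White, B. '		(already ends with sepc: space only)
	terminate_name_list ('[[Black, A.]]', '.')			-->  '[[Black, A.]] '			(ends with sepc + ']]': space only)
]=]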
---------------------------< F O R M A T _ V O L U M E _ I S S U E >----------------------------------------
returns the concatenation of the formatted volume and issue parameters as a single string; or the formatted volume
or the formatted issue alone, or an empty string if neither is set.
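--[=[ Illustrative sketch (not part of the module): typical renderings. The exact wording and markup come from
cfg.messages / cfg.presentation, so treat the rendered text shown here as an assumption:

	journal-style cite, |volume=12 |issue=3		-->  bolded volume with the issue in parentheses: '12 (3)'
	journal-style cite, |volume=Supplement 1	-->  longer than four characters and not numeric, so not bolded
	other cites, |volume=12 |issue=3			-->  plain 'Vol. 12 No. 3' text (the 'vol-no' message)
]=]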
local function format_volume_issue (volume, issue, cite_class, origin, sepc, lower) if not utilities.is_set (volume) and not utilities.is_set (issue) then return ; end-- same condition as in format_pages_sheets local is_journal = 'journal'
origin);
local is_numeric_vol = volume and (volume:match ('^[MDCLXVI]+$') or volume:match ('^%d+$')); -- is only uppercase roman numerals or only digits? local is_long_vol = volume and (4 < mw.ustring.len(volume)); -- is |volume= value longer than 4 characters? if volume and (not is_numeric_vol and is_long_vol) then -- when not all digits or Roman numerals, is |volume= longer than 4 characters? utilities.add_prop_cat ('long-vol'); -- yes, add properties cat end
if is_journal then -- journal-style formatting local vol = ; if utilities.is_set (volume) then if is_numeric_vol then -- |volume= value all digits or all uppercase Roman numerals? vol = utilities.substitute (cfg.presentation['vol-bold'],); -- render in bold face elseif is_long_vol then -- not all digits or Roman numerals; longer than 4 characters? vol = utilities.substitute (cfg.messages['j-vol'],); -- not bold else -- four or fewer characters vol = utilities.substitute (cfg.presentation['vol-bold'],); -- bold end end if utilities.is_set (issue) then return vol .. utilities.substitute (cfg.messages['j-issue'], issue); end return vol; end if 'podcast'
-- all other types of citation if utilities.is_set (volume) and utilities.is_set (issue) then return wrap_msg ('vol-no',, lower); elseif utilities.is_set (volume) then return wrap_msg ('vol',, lower); else return wrap_msg ('issue',, lower); endend
----------------------------< F O R M A T _ P A G E S _ S H E E T S >----------------------------------------

Adds static text to one of the |page(s)= or |sheet(s)= values and returns it with all of the others set to empty strings.
The return order is: page, pages, sheet, sheets.
Singular has priority over plural when both are provided.
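--[=[ Illustrative sketch (not part of the module): the four-value return (page, pages, sheet, sheets) for a
book-style cite. The 'p.' / 'pp.' prefixes and the journal-style separator come from cfg.messages, so treat the
rendered text as an assumption:

	|page=5					-->  the page slot carries the 'p.'-prefixed text; the other three are empty strings
	|pages=12–34			-->  the pages slot carries the 'pp.'-prefixed text
	|pages=7				-->  digits only, so treated as a single page and given the 'p.' prefix
	|page=5 |pages=12–34	-->  singular wins; |pages= is ignored
	|pages=7 |no-pp=yes		-->  the prefix is suppressed (the 'nopp' message)
]=]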
local function format_pages_sheets (page, pages, sheet, sheets, cite_class, origin, sepc, nopp, lower) if 'map'
origin then return , , wrap_msg ('j-sheet', sheet, lower), ; else return , , wrap_msg ('sheet',, lower), ; end elseif utilities.is_set (sheets) then if 'journal'
local is_journal = 'journal'
origin); if utilities.is_set (page) then if is_journal then return utilities.substitute (cfg.messages['j-page(s)'], page), , , ; elseif not nopp then return utilities.substitute (cfg.messages['p-prefix'],), , , ; else return utilities.substitute (cfg.messages['nopp'],), , , ; end elseif utilities.is_set (pages) then if is_journal then return utilities.substitute (cfg.messages['j-page(s)'], pages), , , ; elseif tonumber(pages) ~= nil and not nopp then -- if pages is only digits, assume a single page number return , utilities.substitute (cfg.messages['p-prefix'],), , ; elseif not nopp then return , utilities.substitute (cfg.messages['pp-prefix'],), , ; else return , utilities.substitute (cfg.messages['nopp'],), , ; end end return , , , ; -- return empty stringsend
----------------------------< I N S O U R C E _ L O C _ G E T >----------------------------------------------

Returns one of the in-source locators |page=, |pages=, or |at= (singular |page= has priority), converting a
Wikisource interwiki-link in the selected value to an external link and returning the link label for use as the
COinS page value.
local function insource_loc_get (page, page_orig, pages, pages_orig, at) local ws_url, ws_label, coins_pages, L; -- for Wikisource interwiki-links; TODO: this corrupts page metadata (span remains in place after cleanup; fix there?)
if utilities.is_set (page) then if utilities.is_set (pages) or utilities.is_set (at) then pages = ; -- unset the others at = ; end extra_text_in_page_check (page, page_orig); -- emit error message when |page= value begins with what looks like p., pp., etc.
ws_url, ws_label, L = wikisource_url_make (page); -- make ws URL from |page= interwiki link; link portion L becomes tooltip label if ws_url then page = external_link (ws_url, ws_label .. ' ', 'ws link in page'); -- space char after label to move icon away from in-source text; TODO: a better way to do this? page = utilities.substitute (cfg.presentation['interwiki-icon'],); coins_pages = ws_label; end elseif utilities.is_set (pages) then if utilities.is_set (at) then at = ; -- unset end extra_text_in_page_check (pages, pages_orig); -- emit error message when |page= value begins with what looks like p., pp., etc.
ws_url, ws_label, L = wikisource_url_make (pages); -- make ws URL from |pages= interwiki link; link portion L becomes tooltip label if ws_url then pages = external_link (ws_url, ws_label .. ' ', 'ws link in pages'); -- space char after label to move icon away from in-source text; TODO: a better way to do this? pages = utilities.substitute (cfg.presentation['interwiki-icon'],); coins_pages = ws_label; end elseif utilities.is_set (at) then ws_url, ws_label, L = wikisource_url_make (at); -- make ws URL from |at= interwiki link; link portion L becomes tooltip label if ws_url then at = external_link (ws_url, ws_label .. ' ', 'ws link in at'); -- space char after label to move icon away from in-source text; TODO: a better way to do this? at = utilities.substitute (cfg.presentation['interwiki-icon'],); coins_pages = ws_label; end end return page, pages, at, coins_pages;end
----------------------------< I S _ U N I Q U E _ A R C H I V E _ U R L >------------------------------------

Emits an error message and unsets |archive-url= and |archive-date= when the |archive-url= value is the same as the
|url= or |chapter-url= (or alias...) value.
local function is_unique_archive_url (archive, url, c_url, source, date)
	if utilities.is_set (archive) then
		if archive == url or archive == c_url then
			utilities.set_message ('err_bad_url', {utilities.wrap_style ('parameter', source)});	-- add error message
			return '', '';								-- unset |archive-url= and |archive-date= because same as |url= or |chapter-url=
		end
	end

	return archive, date;
end
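--[=[ Illustrative sketch (not part of the module):

	|url=https://example.com/a  |archive-url=https://example.com/a
		-->  flagged as a bad URL; |archive-url= and |archive-date= are unset
	|url=https://example.com/a  |archive-url=https://web.archive.org/web/20160801000000/https://example.com/a
		-->  kept; the archive URL differs from |url= and |chapter-url=
]=]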
--[=[-------------------------< A R C H I V E _ U R L _ C H E C K >------------------------------------------

Check archive.org URLs to make sure they at least look like they are pointing at valid archives and not to the
save snapshot URL or to calendar pages. When the archive URL is 'https://web.archive.org/save/' (or http://...)
archive.org saves a snapshot of the target page in the URL. That is something that Wikipedia should not allow
unwitting readers to do. When the archive.org URL does not have a complete timestamp, archive.org chooses a
snapshot according to its own algorithm or provides a calendar 'search' result. [[WP:ELNO]] discourages links to
search results.

This function looks at the value assigned to |archive-url= and returns empty strings for |archive-url= and
|archive-date= and an error message when:
	|archive-url= holds an archive.org save command URL
	|archive-url= is an archive.org URL that does not have a complete timestamp (YYYYMMDDhhmmss 14 digits) in the correct place
otherwise returns |archive-url= and |archive-date=

There are two mostly compatible forms of archive.org URL, distinguished by the 'web/' path element:
	//web.archive.org/<timestamp>...			-- the old form (no 'web/' path element)
	//web.archive.org/web/<timestamp>...		-- the new form

The old form does not support or map to the new form when it contains a display flag. There are four identified flags
('id_', 'js_', 'cs_', 'im_') but since archive.org ignores others following the same form (two letters and an underscore)
we don't check for these specific flags but we do check the form.

This function supports a preview mode. When the article is rendered in preview mode, this function may return a
modified archive URL:
	for save command errors, return undated wildcard (/*/)
	for timestamp errors when the timestamp has a wildcard, return the URL unmodified
	for timestamp errors when the timestamp does not have a wildcard, return with timestamp limited to six digits
		plus wildcard (/yyyymm*/)
]=]
local function archive_url_check (url, date)
	local err_msg = '';										-- start with the error message empty
	local path, timestamp, flag;							-- portions of the archive.org URL

	if (not url:match('//web%.archive%.org/')) and (not url:match('//liveweb%.archive%.org/')) then	-- also deprecated liveweb Wayback machine URL
		return url, date;									-- not an archive.org archive, return ArchiveURL and ArchiveDate
	end

	if url:match('//web%.archive%.org/save/') then			-- if a save command URL, we don't want to allow saving of the target page
		err_msg = cfg.err_msg_supl.save;
		url = url:gsub ('(//web%.archive%.org)/save/', '%1/*/', 1);	-- for preview mode: modify ArchiveURL
	elseif url:match('//liveweb%.archive%.org/') then
		err_msg = cfg.err_msg_supl.liveweb;
	else
		path, timestamp, flag = url:match('//web%.archive%.org/([^%d]*)(%d+)([^/]*)/');	-- split out some of the URL parts for evaluation

		if not path then									-- malformed in some way; pattern did not match
			err_msg = cfg.err_msg_supl.timestamp;
		elseif 14 ~= timestamp:len() then					-- path and flag optional, must have 14-digit timestamp here
			err_msg = cfg.err_msg_supl.timestamp;
			if '*' ~= flag then
				local replacement = timestamp:match ('^%d%d%d%d%d%d') or timestamp:match ('^%d%d%d%d');	-- get the first 6 (YYYYMM) or first 4 digits (YYYY)
				if replacement then							-- nil if there aren't at least 4 digits (year)
					replacement = replacement .. string.rep ('0', 14 - replacement:len());	-- year or yearmo (4 or 6 digits) zero-fill to make 14-digit timestamp
					url = url:gsub ('(//web%.archive%.org/[^%d]*)%d[^/]*', '%1' .. replacement .. '*', 1)	-- for preview, modify ts to 14 digits plus splat for calendar display
				end
			end
		elseif utilities.is_set (path) and 'web/' ~= path then	-- older archive URLs do not have the extra 'web/' path element
			err_msg = cfg.err_msg_supl.path;
		elseif utilities.is_set (flag) and not utilities.is_set (path) then	-- flag not allowed with the old form URL (without the 'web/' path element)
			err_msg = cfg.err_msg_supl.flag;
		elseif utilities.is_set (flag) and not flag:match ('%a%a_') then	-- flag if present must be two alpha characters and underscore (requires 'web/' path element)
			err_msg = cfg.err_msg_supl.flag;
		else
			return url, date;								-- return ArchiveURL and ArchiveDate
		end
	end

	-- if here, something not right so
	utilities.set_message ('err_archive_url', {err_msg});	-- add error message and

	if is_preview_mode then
		return url, date;									-- preview mode so return ArchiveURL and ArchiveDate
	else
		return '', '';										-- return empty strings for ArchiveURL and ArchiveDate
	end
end
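--[=[ Illustrative sketch (not part of the module): archive.org URL forms as handled above.

	https://web.archive.org/web/20160801000000/http://example.com/		-- ok: new form with a 14-digit timestamp
	https://web.archive.org/web/2016*/http://example.com/				-- flagged: incomplete timestamp (calendar page)
	https://web.archive.org/save/http://example.com/					-- flagged: save-command URL
	https://web.archive.org/web/20160801000000id_/http://example.com/	-- ok: two-letter-plus-underscore display flag

A standalone check of the timestamp, using the same pattern split as the function above:

	local url = 'https://web.archive.org/web/2016*/http://example.com/';
	local path, timestamp, flag = url:match ('//web%.archive%.org/([^%d]*)(%d+)([^/]*)/');
	if not path or 14 ~= timestamp:len () then
		print ('incomplete or malformed archive.org timestamp');
	end
]=]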
----------------------------< P L A C E _ C H E C K >--------------------------------------------------------

Checks |place=, |publication-place=, and |location= to see if these params include digits. This function was added
because many editors misuse location to specify the in-source location (|page(s)= and |at= are supposed to do that).
returns the original parameter value without modification; added maint cat when parameter value contains digits
local function place_check (param_val)
	if not utilities.is_set (param_val) then				-- parameter empty or omitted
		return param_val;									-- return that empty state
	end

	if mw.ustring.find (param_val, '%d') then				-- not empty, are there digits in the parameter value
		utilities.set_message ('maint_location');			-- yep, add maint cat
	end

	return param_val;										-- and done
end
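--[=[ Illustrative sketch (not part of the module):

	place_check ('London')			-->  'London'			(returned unchanged; no category)
	place_check ('London, p. 5')	-->  'London, p. 5'		(returned unchanged, but the digits add the location maintenance category)
]=]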
----------------------------< I S _ A R C H I V E D _ C O P Y >----------------------------------------------

Compares |title= to 'Archived copy' (a placeholder added by bots that can't find a proper title); if it matches,
returns true; nil else.
local function is_archived_copy (title)
	title = mw.ustring.lower(title);						-- switch title to lower case
	if title:find (cfg.special_case_translation.archived_copy.en) then		-- if title is 'Archived copy'
		return true;
	elseif cfg.special_case_translation.archived_copy['local'] then
		if mw.ustring.find (title, cfg.special_case_translation.archived_copy['local']) then	-- mw.ustring because might not be Latin script
			return true;
		end
	end
end
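--[=[ Illustrative sketch (not part of the module), assuming the English pattern in
cfg.special_case_translation.archived_copy matches the phrase 'archived copy':

	is_archived_copy ('Archived copy')		-->  true
	is_archived_copy ('An actual title')	-->  nil

A local-language pattern, when configured, is tested the same way with mw.ustring.find.
]=]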
----------------------------< C I T A T I O N 0 >------------------------------------------------------------
This is the main function doing the majority of the citation formatting.
local function citation0 (config, args)
	--[[
	Load Input Parameters
	The argument_wrapper facilitates the mapping of multiple aliases to single internal variable.
	]]
	local A = argument_wrapper (args);
	local i
-- Pick out the relevant fields from the arguments. Different citation templates -- define different field names for the same underlying things.
local author_etal; local a = ; -- authors list from |lastn= / |firstn= pairs or |vauthors= local Authors; local NameListStyle = is_valid_parameter_value (A['NameListStyle'], A:ORIGIN('NameListStyle'), cfg.keywords_lists['name-list-style'], ); local Collaboration = A['Collaboration'];
do -- to limit scope of selected local selected = select_author_editor_source (A['Vauthors'], A['Authors'], args, 'AuthorList'); if 1
selected then NameListStyle = 'vanc'; -- override whatever |name-list-style= might be a, author_etal = parse_vauthors_veditors (args, args.vauthors, 'AuthorList'); -- fetch author list from |vauthors=, |author-linkn=, and |author-maskn= elseif 3
A:ORIGIN('Authors') then -- but add a maint cat if the parameter is |authors= utilities.set_message ('maint_authors'); -- because use of this parameter is discouraged; what to do about the aliases is a TODO: end end if utilities.is_set (Collaboration) then author_etal = true; -- so that |display-authors=etal not required end end
local editor_etal; local e = ; -- editors list from |editor-lastn= / |editor-firstn= pairs or |veditors=
do -- to limit scope of selected local selected = select_author_editor_source (A['Veditors'], nil, args, 'EditorList'); -- support for |editors= withdrawn if 1
selected then NameListStyle = 'vanc'; -- override whatever |name-list-style= might be e, editor_etal = parse_vauthors_veditors (args, args.veditors, 'EditorList'); -- fetch editor list from |veditors=, |editor-linkn=, and |editor-maskn= end end local Chapter = A['Chapter']; -- done here so that we have access to |contribution= from |chapter= aliases local Chapter_origin = A:ORIGIN ('Chapter'); local Contribution; -- because contribution is required for contributor(s) if 'contribution'
#a then -- |contributor= requires |author= utilities.set_message ('err_contributor_missing_required_param', 'author'); -- add missing author error message c = ; -- blank the contributors' table; it is used as a flag later end end else -- if not a book cite if utilities.select_one (args, cfg.aliases['ContributorList-Last'], 'err_redundant_parameters', 1) then -- are there contributor name list parameters? utilities.set_message ('err_contributor_ignored'); -- add contributor ignored error message end Contribution = nil; -- unset end
local Title = A['Title']; local TitleLink = A['TitleLink'];
local auto_select = ; -- default is auto local accept_link; TitleLink, accept_link = utilities.has_accept_as_written (TitleLink, true); -- test for accept-this-as-written markup if (not accept_link) and utilities.in_array (TitleLink,) then -- check for special keywords auto_select = TitleLink; -- remember selection for later TitleLink = ; -- treat as if |title-link= would have been empty end
TitleLink = link_title_ok (TitleLink, A:ORIGIN ('TitleLink'), Title, 'title'); -- check for wiki-markup in |title-link= or wiki-markup in |title= when |title-link= is set
local Section = ; -- only; preset to empty string for concatenation if not used if 'map'
Chapter_origin then Section = A['Chapter']; -- get |section= from |chapter= alias list; |chapter= and the other aliases not supported in Chapter = ; -- unset for now; will be reset later from |map= if present end
local Periodical = A['Periodical']; local Periodical_origin = ; if utilities.is_set (Periodical) then Periodical_origin = A:ORIGIN('Periodical'); -- get the name of the periodical parameter local i; Periodical, i = utilities.strip_apostrophe_markup (Periodical); -- strip apostrophe markup so that metadata isn't contaminated if i then -- non-zero when markup was stripped so emit an error message utilities.set_message ('err_apostrophe_markup',); end end
if 'mailinglist'
Periodical = A ['MailingList']; -- error or no, set Periodical to |mailinglist= value because this template is Periodical_origin = A:ORIGIN('MailingList'); end
local ScriptPeriodical = A['ScriptPeriodical'];
-- web and news not tested for now because of -- Wikipedia:Administrators%27_noticeboard#Is_there_a_semi-automated_tool_that_could_fix_these_annoying_"Cite_Web"_errors? if not (utilities.is_set (Periodical) or utilities.is_set (ScriptPeriodical)) then -- 'periodical' templates require periodical parameter -- local p = ; -- for error message local p = ; -- for error message if p[config.CitationClass] then utilities.set_message ('err_missing_periodical',); end end local Volume; local ScriptPeriodical_origin = A:ORIGIN('ScriptPeriodical'); if 'citation'
local Issue; if 'citation'
local Page; local Pages; local At; if not utilities.in_array (config.CitationClass, cfg.templates_not_using_page) then Page = A['Page']; Pages = utilities.hyphen_to_dash (A['Pages']); At = A['At']; end
local Edition = A['Edition']; local PublicationPlace = place_check (A['PublicationPlace'], A:ORIGIN('PublicationPlace')); local Place = place_check (A['Place'], A:ORIGIN('Place')); local PublisherName = A['PublisherName']; local PublisherName_origin = A:ORIGIN('PublisherName'); if utilities.is_set (PublisherName) then local i = 0; PublisherName, i = utilities.strip_apostrophe_markup (PublisherName); -- strip apostrophe markup so that metadata isn't contaminated; publisher is never italicized if i then -- non-zero when markup was stripped so emit an error message utilities.set_message ('err_apostrophe_markup',); end end
local Newsgroup = A['Newsgroup']; -- TODO: strip apostrophe markup? local Newsgroup_origin = A:ORIGIN('Newsgroup');
if 'newsgroup'
PublisherName = nil; -- ensure that this parameter is unset for the time being; will be used again after COinS end
local URL = A['URL']; -- TODO: better way to do this for URL, ChapterURL, and MapURL? local UrlAccess = is_valid_parameter_value (A['UrlAccess'], A:ORIGIN('UrlAccess'), cfg.keywords_lists['url-access'], nil); if not utilities.is_set (URL) and utilities.is_set (UrlAccess) then UrlAccess = nil; utilities.set_message ('err_param_access_requires_param', 'url'); end local ChapterURL = A['ChapterURL']; local ChapterUrlAccess = is_valid_parameter_value (A['ChapterUrlAccess'], A:ORIGIN('ChapterUrlAccess'), cfg.keywords_lists['url-access'], nil); if not utilities.is_set (ChapterURL) and utilities.is_set (ChapterUrlAccess) then ChapterUrlAccess = nil; utilities.set_message ('err_param_access_requires_param',); end
local MapUrlAccess = is_valid_parameter_value (A['MapUrlAccess'], A:ORIGIN('MapUrlAccess'), cfg.keywords_lists['url-access'], nil); if not utilities.is_set (A['MapURL']) and utilities.is_set (MapUrlAccess) then MapUrlAccess = nil; utilities.set_message ('err_param_access_requires_param',); end
----Disable tracking_cats for our testing --
	local this_page = mw.title.getCurrentTitle ();			-- also used for COinS and for language
	local no_tracking_cats = true -- is_valid_parameter_value (A['NoTracking'], A:ORIGIN('NoTracking'), cfg.keywords_lists['yes_true_y'], nil);
-- check this page to see if it is in one of the namespaces that cs1 is not supposed to add to the error categories if not utilities.is_set (no_tracking_cats) then -- ignore if we are already not going to categorize this page if utilities.in_array (this_page.nsText, cfg.uncategorized_namespaces) then no_tracking_cats = "true"; -- set no_tracking_cats end for _, v in ipairs (cfg.uncategorized_subpages) do -- cycle through page name patterns if this_page.text:match (v) then -- test page name against each pattern no_tracking_cats = "true"; -- set no_tracking_cats break; -- bail out if one is found end end end -- check for extra |page=, |pages= or |at= parameters. (also sheet and sheets while we're at it) utilities.select_one (args,, 'err_redundant_parameters'); -- this is a dummy call simply to get the error message and category
local coins_pages; Page, Pages, At, coins_pages = insource_loc_get (Page, A:ORIGIN('Page'), Pages, A:ORIGIN('Pages'), At);
local NoPP = is_valid_parameter_value (A['NoPP'], A:ORIGIN('NoPP'), cfg.keywords_lists['yes_true_y'], nil);
if utilities.is_set (PublicationPlace) and utilities.is_set (Place) then -- both |publication-place= and |place= (|location=) allowed if different utilities.add_prop_cat ('location-test'); -- add property cat to evaluate how often PublicationPlace and Place are used together if PublicationPlace
if PublicationPlace
|trans-title= maps to |trans-chapter= when |title= is re-mapped |url= maps to |chapter-url= when |title= is remapped All other combinations of |encyclopedia=, |title=, and |article= are not modified
local Encyclopedia = A['Encyclopedia']; -- used as a flag by this module and by ~/COinS
if utilities.is_set (Encyclopedia) then -- emit error message when Encyclopedia set but template is other than Encyclopedia: or if 'encyclopaedia' ~= config.CitationClass and 'citation' ~= config.CitationClass then utilities.set_message ('err_parameter_ignored',); Encyclopedia = nil; -- unset because not supported by this template end end
if ('encyclopaedia'
config.CitationClass and utilities.is_set (Encyclopedia)) then if utilities.is_set (Periodical) and utilities.is_set (Encyclopedia) then -- when both set emit an error TODO: make a function for this and similar? utilities.set_message ('err_redundant_parameters',); end
if utilities.is_set (Encyclopedia) then Periodical = Encyclopedia; -- error or no, set Periodical to Encyclopedia; allow periodical without encyclopedia Periodical_origin = A:ORIGIN ('Encyclopedia'); end
if utilities.is_set (Periodical) then -- Periodical is set when |encyclopedia= is set if utilities.is_set (Title) or utilities.is_set (ScriptTitle) then if not utilities.is_set (Chapter) then Chapter = Title; -- |encyclopedia= and |title= are set so map |title= to |article= and |encyclopedia= to |title= ScriptChapter = ScriptTitle; ScriptChapter_origin = A:ORIGIN('ScriptTitle') TransChapter = TransTitle; ChapterURL = URL; ChapterURL_origin = URL_origin;
ChapterUrlAccess = UrlAccess;
if not utilities.is_set (ChapterURL) and utilities.is_set (TitleLink) then Chapter = utilities.make_wikilink (TitleLink, Chapter); end Title = Periodical; ChapterFormat = Format; Periodical = ; -- redundant so unset TransTitle = ; URL = ; Format = ; TitleLink = ; ScriptTitle = ; end elseif utilities.is_set (Chapter) or utilities.is_set (ScriptChapter) then -- |title= not set Title = Periodical; -- |encyclopedia= set and |article= set so map |encyclopedia= to |title= Periodical = ; -- redundant so unset end end end
-- special case for cite techreport. local ID = A['ID']; if (config.CitationClass
-- Account for the oddity that is, before generation of COinS data. local ChapterLink -- = A['ChapterLink']; -- deprecated as a parameter but still used internally by cite episode local Conference = A['Conference']; local BookTitle = A['BookTitle']; local TransTitle_origin = A:ORIGIN ('TransTitle'); if 'conference'
','); -- cite map oddities local Cartography = ""; local Scale = ""; local Sheet = A['Sheet'] or ; local Sheets = A['Sheets'] or ; if config.CitationClass
ChapterUrlAccess = MapUrlAccess; ChapterFormat = A['MapFormat'];
Cartography = A['Cartography']; if utilities.is_set (Cartography) then Cartography = sepc .. " " .. wrap_msg ('cartography', Cartography, use_lowercase); end Scale = A['Scale']; if utilities.is_set (Scale) then Scale = sepc .. " " .. Scale; end end
-- Account for the oddities that are and, before generation of COinS data. local Series = A['Series']; if 'episode'
config.CitationClass then local SeriesLink = A['SeriesLink'];
SeriesLink = link_title_ok (SeriesLink, A:ORIGIN ('SeriesLink'), Series, 'series'); -- check for wiki-markup in |series-link= or wiki-markup in |series= when |series-link= is set
local Network = A['Network']; local Station = A['Station']; local s, n =, ; -- do common parameters first if utilities.is_set (Network) then table.insert(n, Network); end if utilities.is_set (Station) then table.insert(n, Station); end ID = table.concat(n, sepc .. ' '); if 'episode'
if utilities.is_set (Season) and utilities.is_set (SeriesNumber) then -- these are mutually exclusive so if both are set TODO: make a function for this and similar? utilities.set_message ('err_redundant_parameters',); -- add error message SeriesNumber = ; -- unset; prefer |season= over |seriesno= end -- assemble a table of parts concatenated later into Series if utilities.is_set (Season) then table.insert(s, wrap_msg ('season', Season, use_lowercase)); end if utilities.is_set (SeriesNumber) then table.insert(s, wrap_msg ('seriesnum', SeriesNumber, use_lowercase)); end if utilities.is_set (Issue) then table.insert(s, wrap_msg ('episode', Issue, use_lowercase)); end Issue = ; -- unset because this is not a unique parameter Chapter = Title; -- promote title parameters to chapter ScriptChapter = ScriptTitle; ScriptChapter_origin = A:ORIGIN('ScriptTitle'); ChapterLink = TitleLink; -- alias |episode-link= TransChapter = TransTitle; ChapterURL = URL; ChapterUrlAccess = UrlAccess; ChapterURL_origin = URL_origin; ChapterFormat = Format;
Title = Series; -- promote series to title TitleLink = SeriesLink; Series = table.concat(s, sepc .. ' '); -- this is concatenation of season, seriesno, episode number
if utilities.is_set (ChapterLink) and not utilities.is_set (ChapterURL) then -- link but not URL Chapter = utilities.make_wikilink (ChapterLink, Chapter); elseif utilities.is_set (ChapterLink) and utilities.is_set (ChapterURL) then -- if both are set, URL links episode; Series = utilities.make_wikilink (ChapterLink, Series); end URL = ; -- unset TransTitle = ; ScriptTitle = ; Format = ; else -- now oddities that are cite serial Issue = ; -- unset because this parameter no longer supported by the citation/core version of cite serial Chapter = A['Episode']; -- TODO: make |episode= available to cite episode someday? if utilities.is_set (Series) and utilities.is_set (SeriesLink) then Series = utilities.make_wikilink (SeriesLink, Series); end Series = utilities.wrap_style ('italic-title', Series); -- series is italicized end end -- end of stuff
-- handle type parameter for those CS1 citations that have default values local TitleType = A['TitleType']; local Degree = A['Degree']; if utilities.in_array (config.CitationClass,) then TitleType = set_titletype (config.CitationClass, TitleType); if utilities.is_set (Degree) and "Thesis"
if utilities.is_set (TitleType) then -- if type parameter is specified TitleType = utilities.substitute (cfg.messages['type'], TitleType); -- display it in parentheses -- TODO: Hack on TitleType to fix bunched parentheses problem end
-- legacy: promote PublicationDate to Date if neither Date nor Year are set. local Date = A['Date']; local Date_origin; -- to hold the name of parameter promoted to Date; required for date error messaging local PublicationDate = A['PublicationDate']; local Year = A['Year'];
	if not utilities.is_set (Date) then
		Date = Year;										-- promote Year to Date
		Year = nil;											-- make nil so Year as empty string isn't used for CITEREF
		if not utilities.is_set (Date) and utilities.is_set (PublicationDate) then	-- use PublicationDate when |date= and |year= are not set
			Date = PublicationDate;							-- promote PublicationDate to Date
			PublicationDate = '';							-- unset, no longer needed
			Date_origin = A:ORIGIN('PublicationDate');		-- save the name of the promoted parameter
		else
			Date_origin = A:ORIGIN('Year');					-- save the name of the promoted parameter
		end
	else
		Date_origin = A:ORIGIN('Date');						-- not a promotion; name required for error messaging
	end
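--[=[ Illustrative sketch (not part of the module): the promotion order implemented above.

	|date=2020 |year=2019 |publication-date=2018	-->  Date = 2020 (|date= wins; |year= is checked against |date= later)
	|year=2019 |publication-date=2018				-->  Date = 2019 (|year= promoted; Year set to nil so CITEREF doesn't use an empty string)
	|publication-date=2018							-->  Date = 2018 (|publication-date= promoted, then unset)
]=]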
if PublicationDate
--
local DF = is_valid_parameter_value (A['DF'], A:ORIGIN('DF'), cfg.keywords_lists['df'], ); if not utilities.is_set (DF) then DF = cfg.global_df; -- local |df= if present overrides global df set by template end
local ArchiveURL; local ArchiveDate; local ArchiveFormat = A['ArchiveFormat'];
ArchiveURL, ArchiveDate = archive_url_check (A['ArchiveURL'], A['ArchiveDate']) ArchiveFormat = style_format (ArchiveFormat, ArchiveURL, 'archive-format', 'archive-url'); ArchiveURL, ArchiveDate = is_unique_archive_url (ArchiveURL, URL, ChapterURL, A:ORIGIN('ArchiveURL'), ArchiveDate); -- add error message when URL or ChapterURL
local AccessDate = A['AccessDate']; local LayDate = A['LayDate']; local COinS_date = ; -- holds date info extracted from |date= for the COinS metadata by Module:Date verification local DoiBroken = A['DoiBroken']; local Embargo = A['Embargo']; local anchor_year; -- used in the CITEREF identifier do -- create defined block to contain local variables error_message, date_parameters_list, mismatch local error_message = ; -- AirDate has been promoted to Date so not necessary to check it local date_parameters_list = ;
local error_list = ; anchor_year, Embargo = validation.dates(date_parameters_list, COinS_date, error_list);
-- start temporary Julian / Gregorian calendar uncertainty categorization if COinS_date.inter_cal_cat then utilities.add_prop_cat ('jul-greg-uncertainty'); end-- end temporary Julian / Gregorian calendar uncertainty categorization
if utilities.is_set (Year) and utilities.is_set (Date) then -- both |date= and |year= not normally needed; validation.year_date_check (Year, A:ORIGIN ('Year'), Date, A:ORIGIN ('Date'), error_list); end if 0
if true
if modified then -- if the date_parameters_list values were modified AccessDate = date_parameters_list['access-date'].val; -- overwrite date holding parameters with modified values ArchiveDate = date_parameters_list['archive-date'].val; Date = date_parameters_list['date'].val; DoiBroken = date_parameters_list['doi-broken-date'].val; LayDate = date_parameters_list['lay-date'].val; PublicationDate = date_parameters_list['publication-date'].val; end else utilities.set_message ('err_bad_date',); -- add this error message end end -- end of do
local ID_list = ; -- sequence table of rendered identifiers local ID_list_coins = ; -- table of identifiers and their values from args; key is same as cfg.id_handlers's key local Class = A['Class']; -- arxiv class identifier local ID_support =
ID_list, ID_list_coins = identifiers.identifier_lists_get (args,, ID_support);
	-- Account for the oddities that are {{cite arXiv}}, {{cite bioRxiv}}, {{cite citeseerx}}, and {{cite ssrn}}, before generation of COinS data.
	if utilities.in_array (config.CitationClass, whitelist.preprint_template_list) then
		if not utilities.is_set (ID_list_coins[config.CitationClass:upper ()]) then	-- |arxiv= or |eprint= required for cite arxiv; |biorxiv= & |citeseerx= required for their templates
			utilities.set_message ('err_' .. config.CitationClass .. '_missing');	-- add error message
		end
Periodical = [config.CitationClass]; end
-- Link the title of the work if no |url= was provided, but we have a |pmc= or a |doi= with |doi-access=free
if config.CitationClass
if utilities.is_set (URL) then -- set when using an identifier-created URL if utilities.is_set (AccessDate) then -- |access-date= requires |url=; identifier-created URL is not |url= utilities.set_message ('err_accessdate_missing_url'); -- add an error message AccessDate = ; -- unset end
if utilities.is_set (ArchiveURL) then -- |archive-url= requires |url=; identifier-created URL is not |url= utilities.set_message ('err_archive_missing_url'); -- add an error message ArchiveURL = ; -- unset end end end
-- At this point fields may be nil if they weren't specified in the template use. We can use that fact. -- Test if citation has no title if not utilities.is_set (Title) and not utilities.is_set (TransTitle) and not utilities.is_set (ScriptTitle) then -- has special case for cite episode utilities.set_message ('err_citation_missing_title',); end
if utilities.in_array (cfg.keywords_xlate[Title],) and utilities.in_array (config.CitationClass,) and (utilities.is_set (Periodical) or utilities.is_set (ScriptPeriodical)) and ('journal'
ScriptPeriodical_origin) then -- special case for journal cites Title = ; -- set title to empty string utilities.set_message ('maint_untitled'); -- add maint cat end
	-- COinS metadata (see <http://ocoins.info/>) for automated parsing of citation information.
config.CitationClass and utilities.is_set (Encyclopedia)) then if utilities.is_set (Chapter) and utilities.is_set (Title) and utilities.is_set (Periodical) then -- if all are used then coins_chapter = Title; -- remap coins_title = Periodical; end end local coins_author = a; -- default for coins rft.au if 0 < #c then -- but if contributor list coins_author = c; -- use that instead end local QuotePage = A['QuotePage']; local QuotePages = utilities.hyphen_to_dash (A['QuotePages']);
-- this is the function call to COinS local OCinSoutput = metadata.COinS(config.CitationClass);
-- Account for the oddities that are,,, and AFTER generation of COinS data. if utilities.in_array (config.CitationClass, whitelist.preprint_template_list) then -- we have set rft.jtitle in COinS to arXiv, bioRxiv, CiteSeerX, or ssrn now unset so it isn't displayed Periodical = ; -- periodical not allowed in these templates; if article has been published, use cite journal end
-- special case for cite newsgroup. Do this after COinS because we are modifying Publishername to include some static text if 'newsgroup'
local Editors; local EditorCount; -- used only for choosing