User:JPxG/bigindex

user · talk · index (big! · prefix) · pix · bot · logs (CSD · XfD · PROD) · V / E
Drafts: * · 01 · 02 · 03 · 04 · 05 · 06 · 07 · 08 · 09 · 11 · 12 · 13 · 14 · 15 · 16 · 17 · 18 · 19 · 20 · 21 · 22 · 23 · 24 · 25 · 26 · 27 · 28 · 29 · 2 · 3 · 4
Sandbice: * · 01 · 02 · 03 · 04 · 05 · 06 · 07 · 08 · 09 · 10 · 11 · 12 · 13 · 14 · 15 · 16 · 17 · 18 · 66 · 88 · 89 · 98 · 99

User:JPxG/Allylurea


Allylurea is a chemical compound which is best known for User:JPxG being too lazy to ever get around to writing an article about it.

User:JPxG/Bi-LSTM


Bidirectional long short-term memory is an artificial neural network architecture used for deep learning.

Background

Since the origin of computing, artificial intelligence has been an object of study, but during the second half of the 20th century, processing power became more easily accessible and computer-based research became more commonplace. The term "machine learning", used as early as 1959 by IBM researcher Arthur Samuel,[1] currently encompasses a broad variety of statistical learning, data science and neural network approaches to computational problems (often falling under the aegis of artificial intelligence). One of the first neural networks, the perceptron, was introduced in 1957 by Frank Rosenblatt.[2] This machine attempted to recognize pictures and classify them into categories. It consisted of a network of "input neurons" and "output neurons"; each input neuron was connected to every output neuron, with "weights" (set with potentiometers) determining the strength of each connection's influence on output.[3] The architecture of Rosenblatt's perceptron is what would now be referred to as a fully connected single-layer feed-forward neural network (FFNN). Since then, many innovations have occurred, the most significant being the development of deep learning models in which one or more "layers" of neurons exist between the input and output.[4][5]
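
As an illustration, the kind of network described above can be written down in a few lines. The following is a minimal sketch in Python with NumPy, not Rosenblatt's actual hardware; the dimensions and weight values are hypothetical:

    # One fully connected layer of weights followed by a hard threshold:
    # every "input neuron" connects to every "output neuron".
    import numpy as np

    def perceptron(x, W, b):
        return (W @ x + b > 0).astype(int)

    x = np.array([1.0, 0.0, 1.0])   # three "input neurons"
    W = np.random.randn(2, 3)       # weights (potentiometers, in 1957)
    b = np.zeros(2)                 # thresholds, one per output
    print(perceptron(x, W, b))      # two "output neurons", each 0 or 1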

Neural networks are typically initialized with random weights, and "trained" to give consistently correct output for a known dataset (the "training set") using backpropagation to perform gradient descent, in which a system of equations is used to determine the optimal adjustment of all weights in the entire network for a given input/output example.[4][5] In traditional feed-forward neural networks (like Rosenblatt's perceptron), each layer processes output from the previous layer only. Information does not flow backwards, which means that its structure contains no "cycles".[4] In contrast, a recurrent neural network (RNN) has at least one "cycle" of activation flow, where neurons can be activated by neurons in subsequent layers.[4]
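
The difference can be seen in a minimal sketch (hypothetical dimensions and untrained random weights): a feed-forward layer computes its output from the current input alone, while a recurrent step also feeds the network's own previous activation back in, forming the "cycle":

    import numpy as np

    def ffnn_layer(x, W):
        # Output depends only on the current input.
        return np.tanh(W @ x)

    def rnn_step(x, h_prev, Wx, Wh):
        # The cycle: the new hidden state also depends on the previous one.
        return np.tanh(Wx @ x + Wh @ h_prev)

    Wx, Wh = np.random.randn(4, 3), np.random.randn(4, 4)
    h = np.zeros(4)
    for x in np.random.randn(5, 3):  # a sequence of five inputs
        h = rnn_step(x, h, Wx, Wh)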

RNNs, unlike FFNNs, are suited to processing sequential data, since they are capable of encoding different weights (and producing different output) for the same input based on previous activation states. That is to say, a text-prediction model using recurrence could process the string "The dog ran out of the house, down the street, loudly" and produce "barking", while producing "meowing" for the same input sequence featuring "cat" in the place of "dog". Achieving the same output from a purely feed-forward neural network, on the other hand, would require separate activation pathways to be trained for both sentences in their entirety.[6][7]
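
A toy demonstration of this context sensitivity (with hypothetical word vectors and random, untrained weights): feeding a recurrent step two sequences that end in the same word leaves it in two different hidden states, so the network can produce different outputs for the same final input:

    import numpy as np

    rng = np.random.default_rng(0)
    Wx, Wh = rng.normal(size=(4, 3)), rng.normal(size=(4, 4))
    vocab = {"the": [1, 0, 0], "dog": [0, 1, 0], "cat": [0, 0, 1], "ran": [1, 1, 0]}

    def encode(words):
        h = np.zeros(4)
        for w in words:
            h = np.tanh(Wx @ np.array(vocab[w], float) + Wh @ h)
        return h

    h_dog = encode(["the", "dog", "ran"])
    h_cat = encode(["the", "cat", "ran"])
    print(np.allclose(h_dog, h_cat))  # False is expected: the states differ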

However, RNNs and FFNNs are both vulnerable to the "vanishing gradient problem"; since gradients (stored as numbers of finite precision) must be backpropagated over every layer of a model to train it, a model with a large number of layers tends to see gradients "vanish" to zero or "explode" to infinity before getting all the way across. To resolve this problem, long short-term memory (LSTM) models were introduced by Sepp Hochreiter and Jürgen Schmidhuber in 1995–1997, featuring a novel architecture of multiple distinct "cells" with "input", "output" and "forget" gates.[8][9][10] LSTMs would find use in a variety of tasks that RNNs performed poorly at, like learning fine distinctions between rhythmic pattern sequences.[11]
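
A minimal sketch of a single LSTM step, using the now-standard formulation with all three gates (the weights here are hypothetical, and the original 1995–1997 design differed in some details):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x, h, c, W):
        # Input (i), forget (f) and output (o) gates control what enters,
        # what stays in, and what leaves the cell state c.
        z = np.concatenate([x, h])
        i = sigmoid(W["i"] @ z)
        f = sigmoid(W["f"] @ z)
        o = sigmoid(W["o"] @ z)
        g = np.tanh(W["g"] @ z)      # candidate cell contents
        c = f * c + i * g            # additive update: gradients flow along c
        h = o * np.tanh(c)
        return h, c

    H, X = 4, 3
    rng = np.random.default_rng(0)
    W = {k: rng.normal(size=(H, X + H)) for k in "ifog"}
    h, c = np.zeros(H), np.zeros(H)
    for x in rng.normal(size=(5, X)):  # a five-step sequence
        h, c = lstm_step(x, h, c, W)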

While LSTMs proved useful for a variety of applications, like handwriting recognition,[12] they remained limited in their ability to process context; a unidirectional RNN or LSTM's output can only be influenced by previous sequence items.[6] Similar to how the history of the Roman Empire is contextualized by its decline, earlier items in a sequence of images or words tend to take on different meanings based on later items. One example is the following sentence:

He loved his bird more than anything, and cared for it well, and was very distraught to find it had a broken propeller.

Here, the "bird" is being used as a slang term for an airplane, but this only becomes apparent upon parsing the last word ("propeller"). While a human reading this sentence can update their interpretation of the first part after reading the second, a unidirectional neural network (whether feedforward, recurrent, or LSTM) cannot.[6] To provide this capability, bidirectional LSTMs were created. Bidirectional RNNs were first described in 1997 by Schuster and Paliwal as an extension of RNNs.[13]
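
Structurally, the fix is simple: run one recurrent pass left-to-right and another right-to-left, then concatenate the two hidden states at each position. A minimal sketch follows (using a plain tanh recurrence for brevity; in an actual bi-LSTM each direction would be an LSTM cell, and all weights here are hypothetical):

    import numpy as np

    def run_direction(xs, Wx, Wh):
        h, out = np.zeros(Wh.shape[0]), []
        for x in xs:
            h = np.tanh(Wx @ x + Wh @ h)
            out.append(h)
        return out

    def bidirectional(xs, params):
        fwd = run_direction(xs, *params["fwd"])
        bwd = run_direction(xs[::-1], *params["bwd"])[::-1]
        # Each position now sees both its left context (fwd) and its right
        # context (bwd) -- e.g. "bird" can be informed by "propeller".
        return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

    rng = np.random.default_rng(1)
    params = {"fwd": (rng.normal(size=(4, 3)), rng.normal(size=(4, 4))),
              "bwd": (rng.normal(size=(4, 3)), rng.normal(size=(4, 4)))}
    states = bidirectional(list(rng.normal(size=(6, 3))), params)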

Natural language processing

Bidirectional algorithms have long been used in domains outside of deep learning; in 2011, the state of the art in part-of-speech (POS) tagging consisted of classifiers trained on windows of text whose output was fed into bidirectional decoding algorithms at inference time. Collobert et al. cited examples of high-performance POS tagging systems whose bidirectionality was instantiated in dependency networks and Viterbi decoders.[14]
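
For context, Viterbi decoding is the classic example of such a bidirectional decoding pass: a classifier scores each tag at each position independently, and the decoder then picks the globally best tag sequence, so the tag chosen at each position depends on evidence to both its left and its right. Below is a minimal log-domain sketch with hypothetical scores, not the implementation of any particular system cited above:

    import numpy as np

    def viterbi(emit, trans):
        # emit: (T, K) per-position tag scores; trans: (K, K) transition
        # scores between consecutive tags. All scores are log-domain.
        T, K = emit.shape
        score, back = emit[0].copy(), []
        for t in range(1, T):
            cand = score[:, None] + trans    # every previous-tag choice
            back.append(cand.argmax(axis=0))
            score = cand.max(axis=0) + emit[t]
        path = [int(score.argmax())]
        for bp in reversed(back):            # trace the best path backwards
            path.append(int(bp[path[-1]]))
        return path[::-1]

    rng = np.random.default_rng(0)
    print(viterbi(rng.normal(size=(6, 3)), rng.normal(size=(3, 3))))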


In a 2015 paper, Wang et al. described a unified tagging solution using a bidirectional LSTM RNN with word embeddings.[15]

In a 2015 paper, Ling et al. described compositional character models for open-vocabulary word representation.[16]

In a 2015 paper, Kawakami and Dyer described learning to represent words in context with multilingual supervision.[17]

In a 2016 paper, Li and Huang described sentence relation modeling with auxiliary character-level embeddings.[18]

Speech / handwriting

In a 2005 paper, Graves et al. used bidirectional LSTMs for improved phoneme classification and recognition.[19]

In a 2013 paper, Graves et al. used deep bidirectional LSTM for hybrid speech recognition.[20]

In a 2015 paper, Zhang et al. described distant speech recognition using highway LSTM RNNs.[21]

In a 2016 paper, Zayats et al. described disfluency detection using a bidirectional LSTM.[22]

In a 2007 paper, Liwicki et al. presented a novel approach to on-line handwriting recognition based on bidirectional LSTM networks.[23]

Sequence tagging

In a 2015 paper, Huang et al. described bidirectional LSTM-CRF models for sequence tagging.[24]

Other applications

In a 2016 paper, Kiperwasser and Goldberg described dependency parsing using bi-LSTM feature representations.[25]

In a 2021 paper, Zhang et al. described a driving behavior recognition model combining a multi-scale CNN with a bi-LSTM.[26]

In a 2021 paper, Deligiannidis et al. analyzed the performance and complexity of bidirectional RNNs versus Volterra nonlinear equalizers in digital coherent systems.[27]

In a 2021 paper, Oluwalade et al. described human activity recognition using smartphone and smartwatch sensor data.[28]

In a 2021 paper, Dang et al. described LSTM models for malware classification (are they bidirectional?).[29]

In a 2016 paper, Lample et al. described neural architectures for named entity recognition.[30]

In a 2016 paper, Wang et al. described image captioning with bidirectional LSTMs.[31]



References

  1. ^ Samuel, Arthur (1959). "Some Studies in Machine Learning Using the Game of Checkers". IBM Journal of Research and Development. 3 (3): 210–229. CiteSeerX 10.1.1.368.2254. doi:10.1147/rd.33.0210.
  2. ^ Rosenblatt, Frank (1957). "The Perceptron—a perceiving and recognizing automaton". Report 85-460-1. Cornell Aeronautical Laboratory.
  3. ^ Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning. Springer. ISBN 0-387-31073-8.
  4. ^ a b c d Wilson, Bill (24 June 2012). "The Machine Learning Dictionary". www.cse.unsw.edu.au. Archived from the original on 26 August 2018. Retrieved 19 January 2021.
  5. ^ a b Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron (2016). "6.5 Back-Propagation and Other Differentiation Algorithms". Deep Learning. MIT Press. pp. 200–220. ISBN 9780262035613. Archived from the original on 2018-01-27. Retrieved 2021-03-16.
  6. ^ a b c Bajpai, Akash (23 February 2019). "Recurrent Neural Networks: Deep Learning for NLP". Towards Data Science. Archived from the original on 16 March 2021. Retrieved 19 January 2021.
  7. ^ Olah, Chris; Carter, Shan (8 September 2016). "Attention and Augmented Recurrent Neural Networks". Distill. Archived from the original on 22 December 2020. Retrieved 22 January 2021.
  8. ^ Sepp Hochreiter; Jürgen Schmidhuber (21 August 1995), Long Short Term Memory, Wikidata Q98967430
  9. ^ Sepp Hochreiter; Jürgen Schmidhuber (1997). "LSTM can Solve Hard Long Time Lag Problems" (PDF). Advances in Neural Information Processing Systems 9. Advances in Neural Information Processing Systems. Wikidata Q77698282.
  10. ^ Sepp Hochreiter; Jürgen Schmidhuber (1997). "Long short-term memory". Neural Computation. 9 (8): 1735–1780. doi:10.1162/neco.1997.9.8.1735. PMID 9377276. S2CID 1915014. Archived from the original on 2021-01-22. Retrieved 2021-03-08.
  11. ^ Felix A., Gers; Nicol N., Schraudolph; Jürgen, Schmidhuber (2002). "Learning precise timing with LSTM recurrent networks". Journal of Machine Learning Research. 3 (1): 115–143. Archived from the original on 2019-04-04. Retrieved 2021-03-16.
  12. ^ Graves, A.; Liwicki, M.; Fernández, S.; Bertolami, R.; Bunke, H.; Schmidhuber, J. (May 2009). "A Novel Connectionist System for Unconstrained Handwriting Recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence. 31 (5): 855–868. CiteSeerX 10.1.1.139.4502. doi:10.1109/tpami.2008.137. ISSN 0162-8828. PMID 19299860. S2CID 14635907.
  13. ^ Schuster, Mike; Paliwal, Kuldip K. (December 1997). "Bidirectional Recurrent Neural Networks". IEEE Transactions on Signal Processing. 45 (11): 2673–2681. Bibcode:1997ITSP...45.2673S. doi:10.1109/78.650093. S2CID 18375389.
  14. ^ Collobert, Ronan; Weston, Jason; Bottou, Leon; Karlen, Michael; Kavukcuoglu, Koray; Kuksa, Pavel (2011). "Natural Language Processing (Almost) from Scratch" (PDF). Journal of Machine Learning Research. 12: 2493–2537. Archived (PDF) from the original on 2020-12-10. Retrieved 2021-03-08.
  15. ^ Wang, Peilu; Qian, Yao; Soong, Frank K.; He, Lei; Zhao, Hai (2015). "A Unified Tagging Solution: Bidirectional LSTM Recurrent Neural Network with Word Embedding". arXiv:1511.00215 [cs.CL].
  16. ^ Ling, Wang; Luís, Tiago; Marujo, Luís; Ramón Fernandez Astudillo; Amir, Silvio; Dyer, Chris; Black, Alan W.; Trancoso, Isabel (2015). "Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation". arXiv:1508.02096 [cs.CL].
  17. ^ Kawakami, Kazuya; Dyer, Chris (2015). "Learning to Represent Words in Context with Multilingual Supervision". arXiv:1511.04623 [cs.CL].
  18. ^ Li, Peng; Huang, Heng (2016). "Enhancing Sentence Relation Modeling with Auxiliary Character-level Embedding". arXiv:1603.09405 [cs.CL].
  19. ^ Graves, Alex; Fernández, Santiago; Schmidhuber, Jürgen (2005). Bidirectional LSTM networks for improved phoneme classification and recognition (PDF). Springer Berlin Heidelberg. pp. 799–804. Archived (PDF) from the original on 2019-07-07. Retrieved 2021-03-08.
  20. ^ Graves, Alan; Jaitly, Navdeep; Mohamed, Abdel-rahman (2013). "Hybrid speech recognition with deep bidirectional LSTM" (PDF). 2013 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU). Archived (PDF) from the original on 2020-06-05. Retrieved 2021-03-08.
  21. ^ Zhang, Yu; Chen, Guoguo; Yu, Dong; Yao, Kaisheng; Khudanpur, Sanjeev; Glass, James (2015). "Highway Long Short-Term Memory RNNs for Distant Speech Recognition". arXiv:1510.08983 [cs.NE].
  22. ^ Zayats, Vicky; Ostendorf, Mari; Hajishirzi, Hannaneh (2016). "Disfluency Detection using a Bidirectional LSTM". arXiv:1604.03209 [cs.CL].
  23. ^ Liwicki, Marcus; Graves, Alex; Bunke, Horst; Schmidhuber, Jürgen (2007). "A novel approach to on-line handwriting recognition based on bidirectional long short-term memory networks" (PDF). Proceedings of the 9th International Conference on Document Analysis and Recognition. Vol. 1. Archived (PDF) from the original on 2019-07-07. Retrieved 2021-03-08.
  24. ^ Huang, Zhiheng; Xu, Wei; Yu, Kai (2015). "Bidirectional LSTM-CRF Models for Sequence Tagging". arXiv:1508.01991 [cs.CL].
  25. ^ Kiperwasser, Eliyahu; Goldberg, Yoav (2016). "Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations". Transactions of the Association for Computational Linguistics. 4: 313–327. arXiv:1603.04351. Bibcode:2016arXiv160304351K. doi:10.1162/tacl_a_00101. S2CID 1642392. Archived from the original on 2021-02-24. Retrieved 2021-03-08.
  26. ^ Zhang, He; Nan, Zhixiong; Yang, Tao; Liu, Yifan; Zheng, Nanning (2021). "A Driving Behavior Recognition Model with Bi-LSTM and Multi-Scale CNN". arXiv:2103.00801 [cs.CV].
  27. ^ Deligiannidis, Stavros; Mesaritakis, Charis; Bogris, Adonis (2021). "Performance and Complexity Analysis of Bi-Directional Recurrent Neural Network Models Versus Volterra Nonlinear Equalizers in Digital Coherent Systems". Journal of Lightwave Technology. 39 (18): 5791–5798. arXiv:2103.03832. doi:10.1109/JLT.2021.3092415. S2CID 232135347.
  28. ^ Oluwalade, Bolu; Neela, Sunil; Wawira, Judy; Adejumo, Tobiloba; Purkayastha, Saptarshi (2021). "Human Activity Recognition using Deep Learning Models on Smartphones and Smartwatches Sensor Data". arXiv:2103.03836 [eess.SP].
  29. ^ Dang, Dennis; Fabio Di Troia; Stamp, Mark (2021). "Malware Classification Using Long Short-Term Memory Models". arXiv:2103.02746 [cs.CR].
  30. ^ Lample, Guillaume; Ballesteros, Miguel; Subramanian, Sandeep; Kawakami, Kazuya; Dyer, Chris (2016). "Neural Architectures for Named Entity Recognition". arXiv:1603.01360 [cs.CL].
  31. ^ Wang, Cheng; Yang, Haojin; Bartz, Christian; Meinel, Christoph (2016). "Image Captioning with Deep Bidirectional LSTMs". arXiv:1604.00790 [cs.CV].


User:JPxG/Draft

References

User:JPxG/Draft01

Generative Pre-trained Transformer
Original author(s): OpenAI[1]
Initial release: June 11, 2020 (beta)
Type: Autoregressive transformer language model
Website: openai.com/blog/openai-api

GPT-J

These sources are pretty weak, but we'll see what we can come up with:

https://slate.com/technology/2022/08/4chan-ai-open-source-trolling.html
https://www.marktechpost.com/2022/04/06/meet-gpt-neox-20b-a-20-billion-parameter-natural-language-processing-ai-model-open-sourced-by-eleutherai/
https://analyticsindiamag.com/write-an-essay-in-5-lines-of-code-using-gpt-neo/
https://www.infoq.com/news/2021/07/eleutherai-gpt-j/
https://www.marktechpost.com/2022/11/29/what-are-large-language-models-llms-applications-and-types-of-llms/
https://analyticsindiamag.com/meet-gpt-jt-the-closest-open-source-alternative-to-gpt-3/
https://mindmatters.ai/2022/04/why-gpt-3-cant-understand-anything/
https://www.theregister.com/2023/01/16/in_brief_ai/
https://www.law.com/legaltechnews/2023/01/31/tracking-generative-ai-how-evolving-ai-models-are-impacting-legal/?slreturn=20230031200957
https://www.theregister.com/2023/01/23/turnitin_chatgpt_detector/
https://insidebigdata.com/2023/01/18/originality-ai-allows-users-to-quickly-detect-ai-written-content-with-a-chrome-extension/
https://whatsnewinpublishing.com/ai-generated-content-this-browser-extension-will-spot-it/
https://www.vice.com/en/article/pkg94v/deepfake-voice-do-not-pay-wells-fargo-refund
https://www.abajournal.com/news/article/ai-program-earned-passing-bar-exam-scores-on-evidence-and-torts-can-it-work-in-court
https://analyticsindiamag.com/top-7-tools-for-detecting-ai-generated-content/
https://www.smithsonianmag.com/smart-news/the-first-ai-lawyer-will-help-defendants-fight-speeding-tickets-180981508/
https://mindmatters.ai/2023/01/is-chatgpt-solely-a-neural-network-i-tested-that/
https://venturebeat.com/ai/what-happens-to-an-llm-after-its-trained/

User:JPxG/Draft02

Blank [2] [3] [4] [5] [6] [7] [8] [9] [10] [11]

References

  1. ^ Cite error: The named reference neurips_Brown_202012 was invoked but never defined (see the help page).
  2. ^ a
  3. ^ b
  4. ^ c
  5. ^ d
  6. ^ e
  7. ^ f
  8. ^ g
  9. ^ h
  10. ^ i
  11. ^ j

User:JPxG/Draft03

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft04

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft05

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft06

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft07

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft08

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft09

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft11

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft12

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft13

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft14

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft15

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft16

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft17

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft18

Sand screw

Sand screws.[1] [2] [3] [4] [5] [6] [7] [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20]

References

User:JPxG/Draft19

Flame Warriors

Posts.[1] [2] [3] [4] [5] [6] [7] [8] [9] [10] [11] [12] [13] [14]

References

  1. ^ "Seeking new Flame Warriors". www.theregister.com.
  2. ^ Baker, Bob (December 21, 2003). "Flamers and Their ALL-CAPS Screeds, You Know the Type" – via www.washingtonpost.com.
  3. ^ "Meet Furious Typer and Netiquette Nazi". The Irish Times.
  4. ^ Baker, Bob (Los Angeles Times). "Boorish Internet Behavior Draws His Ire -- and His Web Site". nydailynews.com.
  5. ^ "A flaming desire". Los Angeles Times. December 12, 2003.
  6. ^ "0929 issue of the Daily Journal | Paleolithic Diet | Violence". Scribd.
  7. ^ "Corel Painter 11 - Corel Painter Masters - Mike Reed". www.corel.com.
  8. ^ "Flame Warriors: Recognise Yourself? – Cybersoc".
  9. ^ "Vocabulario y términos que todo Community Manager debería conocer". PuroMarketing.
  10. ^ Stephen, Bijan (June 26, 2019). "Something Awful's founder thinks YouTube sucks at moderation". The Verge.
  11. ^ "Flame Warriors". September 8, 2015. Archived from the original on 2015-09-08.
  12. ^ "The Flame Warriors". The RiotACT.
  13. ^ https://radgeek.com/gt/2001/08/13/flame_warriors/
  14. ^ Edelmann, Noella (2013). "Reviewing the Definitions of "Lurkers" and Some Implications for Online Research". Cyberpsychology, Behavior, and Social Networking. 16 (9): 645–649. doi:10.1089/cyber.2012.0362. PMID 23848960.


/[1] /[2] /[3] /[4] /[5] /[6] /[7] /[8] /[9] /[10]

[11]

/[12] /[13] /[14] ?[15] ?[16]

/[17]

mugs shirts /[18] /[19] /[20]

[21]

probably irrelevant: [22] [23] [24] [25] [26] [27] [28] [29] [30] [31] [32] [33]

[34] [35] [36]

[37]

[38] [39] [40] [41] [42] [43] [44] [45] [46] [47]

User:JPxG/Draft20

Pig poop balls

Pig poop balls.[48] [49] [50] [51] [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62]

References

  1. ^ "Page 33". The News Journal. Wilmington, Delaware. 2004-01-06. p. 33.
  2. ^ "Page 34". The News Journal. Wilmington, Delaware. 2004-01-06. p. 34.
  3. ^ "Page 21". The Herald. Rock Hill, South Carolina. 2003-12-28. p. 21.
  4. ^ "Page 17". The Herald. Rock Hill, South Carolina. 2003-12-28. p. 17.
  5. ^ "Page E4". The Orlando Sentinel. Orlando, Florida. 2003-12-27. p. E4.
  6. ^ "Page 59". The Sacramento Bee. Sacramento, California. 2003-12-26. p. 59.
  7. ^ "Page 63". The Sacramento Bee. Sacramento, California. 2003-12-26. p. 63.
  8. ^ "Page E1". The Sacramento Bee. Sacramento, California. 2003-12-26. p. E1.
  9. ^ "Page E5". The Sacramento Bee. Sacramento, California. 2003-12-26. p. E5.
  10. ^ "Page 51". The News and Observer. Raleigh, North Carolina. 2003-12-24. p. 51.
  11. ^ "Page 33-46". Chicago Tribune. Chicago, Illinois. 2003-12-24. p. 33-46.
  12. ^ "Page F3". The News and Observer. Raleigh, North Carolina. 2003-12-24. p. F3.
  13. ^ "Page 29". Statesman Journal. Salem, Oregon. 2003-12-22. p. 29.
  14. ^ "Page 31". Longview Daily News. Longview, Washington. 2003-12-22. p. 31.
  15. ^ "Page 1-17". Chicago Tribune. Chicago, Illinois. 2003-12-20. p. 1-17.
  16. ^ "Page 1-23". Chicago Tribune. Chicago, Illinois. 2003-12-20. p. 1-23.
  17. ^ "Page 73". South Florida Sun Sentinel. Fort Lauderdale, Florida. 2003-12-20. p. 73.
  18. ^ "Page 80". South Florida Sun Sentinel. Fort Lauderdale, Florida. 2003-12-20. p. 80.
  19. ^ "Page 1-33". Chicago Tribune. Chicago, Illinois. 2003-12-20. p. 1-33.
  20. ^ "Page 1-39". Chicago Tribune. Chicago, Illinois. 2003-12-20. p. 1-39.
  21. ^ "Page 161". The Los Angeles Times. Los Angeles, California. 2003-12-12. p. 161.
  22. ^ "Page 281". Hartford Courant. Hartford, Connecticut. 2003-12-10. p. 281.
  23. ^ "Page 311". Hartford Courant. Hartford, Connecticut. 2003-12-10. p. 311.
  24. ^ "Page 317". Hartford Courant. Hartford, Connecticut. 2003-12-10. p. 317.
  25. ^ "Page 10". Calgary Herald. Calgary, Alberta, Canada. 2003-11-21. p. 10.
  26. ^ "Page 78". Calgary Herald. Calgary, Alberta, Canada. 2003-11-21. p. 78.
  27. ^ "Page 79". Calgary Herald. Calgary, Alberta, Canada. 2003-11-21. p. 79.
  28. ^ "Page 14". The Press-Tribune. Roseville, California. 2003-10-15. p. 14.
  29. ^ "Page E3". The Atlanta Constitution. Atlanta, Georgia. 2003-05-07. p. E3.
  30. ^ "Page 12". Hattiesburg American. Hattiesburg, Mississippi. 2003-05-06. p. 12.
  31. ^ "Page 48". The San Francisco Examiner. San Francisco, California. 2002-11-21. p. 48.
  32. ^ "Page 25". Rocky Mount Telegram. Rocky Mount, North Carolina. 2002-11-06. p. 25.
  33. ^ "Page 13". Globe-Gazette. Mason City, Iowa. 2002-08-12. p. 13.
  34. ^ "Page 37". Victoria Advocate. Victoria, Texas. 2002-07-05. p. 37.
  35. ^ "Page 53". The Pantagraph. Bloomington, Illinois. 2000-07-20. p. 53.
  36. ^ "Page 39". Star Tribune. Minneapolis, Minnesota. 2000-06-06. p. 39.
  37. ^ "Page 98". Newsday. New York, New York. 1995-08-08. p. 98.
  38. ^ "Page 60". The Charlotte Observer. Charlotte, North Carolina. 1995-05-11. p. 60.
  39. ^ "Page 159". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 159.
  40. ^ "Page 123". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 123.
  41. ^ "Page 175". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 175.
  42. ^ "Page 107". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 107.
  43. ^ "Page 115". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 115.
  44. ^ "Page 151". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 151.
  45. ^ "Page 149". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 149.
  46. ^ "Page 141". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 141.
  47. ^ "Page 131". Hartford Courant. Hartford, Connecticut. 1995-03-06. p. 131.
  48. ^ Read, Max. "How I Made $70 Selling Myself on Twitter". Gawker.
  49. ^ "Chris Matthews Is Just Another Crybully". jacobinmag.com.
  50. ^ "Elizabeth Warren's Plan to Combat Misinformation Could Help Ruin the Internet". www.vice.com.
  51. ^ "Pig Poops On Own Balls (Photo NSFW Because Pig Is Pooping On Its Own Giant Balls)". Deadspin.
  52. ^ "Doxxed: The Truth Behind Piggy Poop Balls". BuzzFeed News.
  53. ^ "Parler CEO Says He'll Ban Users for Posting Bad Words, Dicks, Boobs, or Poop". Gizmodo.
  54. ^ Staff, Deadspin. "This Is How The Deadspin Staff Is Voting". The Concourse.
  55. ^ "Reply Allpocalypse Ruins Deadspin Bracket Pool". Deadspin.
  56. ^ "The 13 Most Powerful Images of Naked Celebrities of 2012". web.archive.org. March 17, 2013.
  57. ^ "The 19 Most Powerful Images of 2012". web.archive.org. March 17, 2013.
  58. ^ "25 Years of "The Net," and the new age of techno-thrillers". July 28, 2020.
  59. ^ Wilson, Shania (July 2, 2021). "What is the pig poop balls meme? Old meme resurfaces amidst new social GETTR app". HITC.
  60. ^ Lord, Debbie. "Truth Social: Trump announces launch of social media site, media company". KIRO 7 News Seattle.
  61. ^ Binder, Matt (October 21, 2021). "Trolls swamped Trump's new social network 'TRUTH' before it even launched". Mashable.
  62. ^ Goforth, Claire (October 21, 2021). "Trump's new social media site collapses after trolls flood it before launch". The Daily Dot.

User:JPxG/Draft21

Blank [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

References

  1. ^ a
  2. ^ b
  3. ^ c
  4. ^ d
  5. ^ e
  6. ^ f
  7. ^ g
  8. ^ h
  9. ^ i
  10. ^ j

User:JPxG/Draft22

StickerWorld

[1] [2] [3] [4]

"launched february 1999", "2.6 million page views per month, 100k reg users", "a lightning bolt signifies an animated sticker, while a wrench means you can change the way a sticker looks on your page"
also at [5]

[6] [7] [8] [9] [10] [11]

Number of unique visitors in February: 300,000.
Number of visitors aged 2-11: 45,000.

[12]

[13]

In January, the Family Workshop site added content aimed at an elementary-school audience, older than the Sesame Street crowd. The site's new Kid City lets children 6 to 12 years old create their own home pages and collect and trade on-line stickers.

User:JPxG/Draft23

Royal Dutch Shell Martinez Refinery
The refinery in May 2021
Country: United States
State: California
City: Martinez
Coordinates: 38°00′47″N 122°06′21″W / 38.01315°N 122.1058°W
Refinery details
Operator: Martinez Refining Company
Owner(s): Royal Dutch Shell (1915–2020); PBF Energy (2020–)
Commissioned: 1915
Capacity: 157,000 bbl/d (25,000 m3/d)[14]

It refines crude oil.

Founded in 1915.[15][16][17] Was Shell's first US refinery.[16] The terminal was built in 1913 by the American Gasoline Company.[16] Address is 3485 Pacheco Blvd, Martinez, CA 94553.[16] In 2020 it was Shell's only refinery in California.[18]

Spilled 400,000 gallons of crude oil into the Carquinez Strait in 1988.[15] Part of a "handful" of environmental incidents.[15] In March 2019, Shell paid $165,000 to settle 16 air violations from 2015 and 2016.[17] In December 2016 they flared off almost 20 tons.[17] 73 flares between 2005 and 2018.[17] Pump fire in a process unit on June 7, 2019; workers evacuated.[17]

Gasoline is 85% of production.[16] Also makes "asphalt, diesel, jet turbine fuel, petroleum coke, propane, residual fuel oils, and sulfur".[16][17][19] In 2017 "the refinery has enjoyed a generally positive relationship with the city of Martinez over the years".[19]

Shell had been trying to sell it since 2016.[15] In 2021, the Mercury News said that it would be affected by new rules (what are they?).[20] The costs would be "approximately 0.62% of estimated annually revenue".[20] PBF suggested a $40 million project that would bring down particulate emissions.[21] It was PBF's second refinery on the West Coast.[18]

Located on an 860-acre site.[14][22] 157,000 barrels per day.[14][18] Dual-coking refinery and integrated logistics.[14] Royal Dutch Shell PLC's subsidiary (Equilon Enterprises, doing business as Shell Oil Products US) sold it to PBF Energy.[14] PBF owns it; the Martinez Refining Co. LLC (which PBF owns) operates it.[15] They were "in talks" in 2017.[19] Sale completed in February 2020,[22] for $1.2 billion.[14][23][24][25] Cost of assets was $960.0 million, plus the value of the inventory.[22] Part of a global downstream divestment from Shell.[14] Plans made to (more stuff about Shell's plans afterward, etc).[14] "Martinez’s on-site logistics assets, including a deep-water marine terminal, product distribution terminals, and refinery crude and product storage installations with about 8.8 million bbl of shell capacity."[14][24] "adjacent truck rack and terminal".[23] Has a Nelson complexity index of 16.1 ("one of the most complex refineries in the United States").[22][24][18] Proposed renewable diesel project using idled equipment.[23][24][18]

According to Dun & Bradstreet, annual revenue is $147.65 million.[16]

In 2019, Shell employed over 700 people at the site.[17] These employees were to be offered jobs at PBF when it took over.[17]

The coronavirus pandemic hit in 2020. PBF said its refineries were running at 30% capacity.[17] They sold five hydrogen plants nationwide, for a total of $530 million.[26] Two of them were at the Martinez facility.[17] There are three hydrogen plants there; one had been owned by Air Products since 1996.[17] They separate the sulfur from the other process streams.[17] They're steam methane reformer hydrogen production plants.[27]

In June 2021, PBF said that if new regulations went through, the Martinez plant would shut down.[28] This was to do with fluidized catalytic cracking units (more info in source).[28]

Malfunctions in July 2018; health advisory issued in Martinez and Pacheco.[29][30] Flaring incident July 6; fire at compressor unit; >100 lb of hydrogen sulfide.[30] "Five refinery problems over four days", >8,500 lb of gas. A lot of material in this source.[30]

Operations slowed during the coronavirus pandemic. Throughput was 30% below expectations in April 2020. Transitioned to idle operating status.[27]

Flaring incident in December 2016; "thousands of pounds of toxic gas" released. Caused by a power outage at a decades-old substation. 39,000 lb of light hydrocarbons and hydrogen sulfide were sent to flares on December 19, 2016.[31][32]

More stuff here.[32]

And here.[33]

References

  1. ^ "Karen SIdeman". NYU | Game Center.
  2. ^ https://www.webvisionsevent.com/speaker/sideman-karen/
  3. ^ "NewKidCo Introduces Sesame Street Sports at E3". www.casinocitytimes.com.
  4. ^ "IQ Interactive Special Report: Just Kidding".
  5. ^ https://worldradiohistory.com/hd2/IDX-Business/Magazines/Archive-Mediaweek-IDX/IDX/00s/Mediaweek-2000-05-01-OCR-Page-0104.pdf
  6. ^ "Intel® WebOutfitter(SM) Service Brings New Internet Experiences To Owners Of Pentium® III Processor-Based PCs". www.intel.com.
  7. ^ http://www.wwwac.org/
  8. ^ "Sesame Workshop - Sticker World". web.archive.org. August 11, 2008.
  9. ^ "Sesame Workshop - Sticker World". web.archive.org. March 12, 2007.
  10. ^ "Sesame Workshop - Sticker World". web.archive.org. February 19, 2007.
  11. ^ "Web Sites Appealing to Children". archive.nytimes.com.
  12. ^ "Sesame Workshop Relaunches Sesame Street Website Relaunches". Writers Write.
  13. ^ Slatalla, Michelle (April 22, 1999). "'Sesame Street' Site: Serious Child's Play" – via NYTimes.com.
  14. ^ a b c d e f g h i "StackPath". www.ogj.com.
  15. ^ a b c d e "After 105 years, Martinez refinery no longer owned by Shell". Bay City News. February 1, 2020.
  16. ^ a b c d e f g https://www.dnb.com/business-directory/company-profiles.shell_martinez_refining_company.13dbbd3d18597ca781b6123f7c7a4b0a.html
  17. ^ a b c d e f g h i j k l "Shell to Sell Martinez Refinery for $1 Billion". KQED.
  18. ^ a b c d e "Shell Divests California Refinery". www.rigzone.com.
  19. ^ a b c "Report: Sale of Shell Martinez refinery progressing". March 24, 2017.
  20. ^ a b "Editorial: Bay Area refinery rules would improve environment and health". June 18, 2021.
  21. ^ "Opinion: Bay Area air board can reduce emissions without killing jobs". July 3, 2021.
  22. ^ a b c d "PBF Energy Completes Acquisition of Martinez Refinery, Creates West Coast System". investors.pbfenergy.com.
  23. ^ a b c "Shell Finalizes Martinez Refinery Sale". CStore Decisions. February 4, 2020.
  24. ^ a b c d "Shell unit Equilon Enterprises closes $1.2bn sale of Martinez Refinery". www.hydrocarbons-technology.com.
  25. ^ "Shell's Largest Refinery Reduces Crude Processing Capacity By 50%". OilPrice.com.
  26. ^ "Coronavirus Crisis Prompts Martinez Refinery to Cut Back, Sell Hydrogen Plants". KQED.
  27. ^ a b Weilenman, Donna Beth (April 20, 2020). "Martinez refineries adjusting to coronavirus crisis".
  28. ^ a b "Air Regulators Weigh Plan Aimed at Dramatically Cutting Bay Area Refinery Pollution". KQED.
  29. ^ "Health Advisory Lifted for Martinez, Pacheco After Shell Refinery 'Shutdown'". KQED.
  30. ^ a b c "Malfunctions at Shell's Martinez Refinery More Serious Than First Reported". KQED.
  31. ^ "Shell Not Revealing Full List of Gases Released in December Martinez Refinery Flares". KQED.
  32. ^ a b "Shell's Martinez Refinery Sent Close to 20 Tons of Gas to its Flares During Monday Outage". KQED.
  33. ^ "Monroe Spaght, former trustee and retired Shell Oil executive, dies at 83". news.stanford.edu.

User:JPxG/Draft24

Flying Caduceus

Flying Caduceus.

References


User:JPxG/Draft25

The station in 2021

Wheeler Ridge Compressor Station

The Wheeler Ridge Compressor Station.[1] [2] [3] [4] [5] [6] [7]

References

User:JPxG/Draft26

User:JPxG/Draft27

Developer(s): Anuke

Mindustry is a real-time strategy, factory management, and tower defense game developed and published by Anuken under the free and open-source GNU General Public License.[1][2] It is available for Windows, macOS, Linux, Android and iOS.[3]


[4] [2] [5] [6] [7] [8] [9]

References

  1. ^ "Mindustry". mindustrygame.github.io.
  2. ^ a b Bolding, Jonathan (December 5, 2020). "The factory-building tower defense of Mindustry gets a huge 6.0 update" – via www.pcgamer.com.
  3. ^ https://anuke.itch.io/mindustry
  4. ^ Sykes, Tom (February 24, 2020). "Do you like Factorio and Mindustry? Play free production game DeFacto" – via www.pcgamer.com.
  5. ^ Ganapathi, Anusha. "Mindustry mind the details". The New Indian Express.
  6. ^ Lagace, Marc (January 24, 2020). "Mindustry is the ultimate sandbox tower defense game for Android [Game of the week]". Android Central.
  7. ^ Peeples, Jeremy (July 27, 2022). "Fanatical Build Your Own Triple Pack Bundle Now Available - Hardcore Gamer". hardcoregamer.com.
  8. ^ Dawe, Liam (November 15, 2022). "Mixing factory management, Tower Defense and RTS - Mindustry 7.0 is out now". GamingOnLinux.
  9. ^ https://gamerant.com/best-factory-simulation-games/

User:JPxG/Draft2

References

User:JPxG/Draft3

References

User:JPxG/Draft4

backpack girl[1] [2] [3] [4] [5] [6] [7] [8]

References
