User:JPxG/bigindex
Drafts and sandbice
User:JPxG/Allylurea
Drafts: * · 01 · 02 · 03 · 04 · 05 · 06 · 07 · 08 · 09 · 11 · 12 · 13 · 14 · 15 · 16 · 17 · 18 · 19 · 20 · 21 · 22 · 23 · 24 · 25 · 26 · 27 · 28 · 29 · 2 · 3 · 4
Sandbice: * · 01 · 02 · 03 · 04 · 05 · 06 · 07 · 08 · 09 · 10 · 11 · 12 · 13 · 14 · 15 · 16 · 17 · 18 · 66 · 88 · 89 · 98 · 99
Allylurea is a chemical compound which is best known for User:JPxG being too lazy to ever get around to writing an article about it.
User:JPxG/Bi-LSTM
Bidirectional long short-term memory is an artificial neural network architecture used for deep learning.

Background

Since the origin of computing, artificial intelligence has been an object of study, but during the second half of the 20th century, processing power became more easily accessible and computer-based research became more commonplace. The term "machine learning", used as early as 1959 by IBM researcher Arthur Samuel,[1] currently encompasses a broad variety of statistical learning, data science and neural network approaches to computational problems (often falling under the aegis of artificial intelligence).

The first neural network, the perceptron, was introduced in 1957 by Frank Rosenblatt.[2] This machine attempted to recognize pictures and classify them into categories. It consisted of a network of "input neurons" and "output neurons"; each input neuron was connected to every output neuron, with "weights" (set with potentiometers) determining the strength of each connection's influence on output.[3] The architecture of Rosenblatt's perceptron is what would now be referred to as a fully-connected single-layer feed-forward neural network (FFNN).

Since then, many different innovations have occurred, the most significant being the development of deep learning models, in which one or more "layers" of neurons exist between the input and output.[4][5] Neural networks are typically initialized with random weights and "trained" to give consistently correct output for a known dataset (the "training set") using backpropagation to perform gradient descent, in which a system of equations is used to determine the optimal adjustment of all weights in the entire network for a given input/output example.[4][5]

In traditional feed-forward neural networks (like Rosenblatt's perceptron), each layer processes output from the previous layer only. Information does not flow backwards, which means that the network's structure contains no "cycles".[4] In contrast, a recurrent neural network (RNN) has at least one "cycle" of activation flow, where neurons can be activated by neurons in subsequent layers.[4] RNNs, unlike FFNNs, are suited to processing sequential data, since they are capable of encoding different weights (and producing different output) for the same input based on previous activation states. That is to say, a text-prediction model using recurrence could process the string "The dog ran out of the house, down the street, loudly" and produce "barking", while producing "meowing" for the same input sequence with "cat" in the place of "dog". Achieving the same output from a purely feed-forward neural network, on the other hand, would require separate activation pathways to be trained for both sentences in their entirety.[6][7]

However, RNNs and FFNNs are both vulnerable to the "vanishing gradient problem": since gradients (stored as numbers of finite precision) must be backpropagated over every layer of a model to train it, a model with a large number of layers tends to see gradients "vanish" to zero or "explode" to infinity before getting all the way across. To resolve this problem, long short-term memory (LSTM) models were introduced by Sepp Hochreiter and Jürgen Schmidhuber in 1995–1997, featuring a novel architecture of multiple distinct "cells" with "input", "output" and "forget" gates.[8][9][10]
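To make the gating architecture concrete, the following is a minimal NumPy sketch of a single LSTM cell step; the function and variable names, and the sizes in the usage loop, are illustrative inventions rather than anything taken from the cited papers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a single LSTM cell.

    x      -- input vector at the current timestep
    h_prev -- hidden state from the previous timestep
    c_prev -- cell state from the previous timestep
    W, U   -- input and recurrent weight matrices, rows stacked for the
              input (i), forget (f), output (o) gates and candidate (g)
    b      -- stacked bias vector
    """
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # gate activations in (0, 1)
    g = np.tanh(g)                                # candidate cell update
    c = f * c_prev + i * g                        # forget old state, write new
    h = o * np.tanh(c)                            # gated output / hidden state
    return h, c

# Illustrative usage over a random 10-step sequence.
H, D = 8, 5
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h = c = np.zeros(H)
for x in rng.normal(size=(10, D)):
    h, c = lstm_step(x, h, c, W, U, b)
```

Each gate is a learned sigmoid that decides, element by element, how much of the old cell state to keep, how much of the candidate update to write, and how much of the cell state to expose as output; because the cell state is carried forward additively, gradients can survive over many more timesteps than in a plain RNN.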
LSTMs would find use in a variety of tasks that RNNs performed poorly at, such as learning fine distinctions between rhythmic pattern sequences.[11] While LSTMs proved useful for applications like handwriting recognition,[12] they remained limited in their ability to process context; a unidirectional RNN or LSTM's output can only be influenced by previous sequence items.[6] Similar to how the history of the Roman Empire is contextualized by its decline, earlier items in a sequence of images or words tend to take on different meanings based on later items. One example is a sentence in which "bird" is used as a slang term for an airplane, but where this only becomes apparent upon parsing the last word ("propeller"). While a human reading such a sentence can update their interpretation of the first part after reading the second, a unidirectional neural network (whether feedforward, recurrent, or LSTM) cannot.[6] To provide this capability, bidirectional LSTMs were created, as sketched below. Bidirectional RNNs were first described in 1997 by Schuster and Paliwal as an extension of RNNs.[13]
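A short PyTorch sketch (the sizes and tensors here are invented for illustration, not taken from any cited paper) shows the basic mechanism: the layer runs one LSTM pass forward and one backward over the sequence, so each position's output is conditioned on both earlier and later items.

```python
import torch
import torch.nn as nn

# A bidirectional LSTM over one toy sequence of embedded "tokens".
# embed_dim and hidden_dim are arbitrary illustrative sizes.
embed_dim, hidden_dim = 16, 32
bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)

x = torch.randn(1, 9, embed_dim)      # batch of 1 sequence, 9 timesteps
outputs, (h_n, c_n) = bilstm(x)

print(outputs.shape)                  # torch.Size([1, 9, 64]): 2 * hidden_dim
# outputs[:, t, :hidden_dim] comes from the forward pass (items 0..t);
# outputs[:, t, hidden_dim:] comes from the backward pass (items t..8),
# so a late word like "propeller" can influence the representation of an
# earlier word like "bird".
```

In the applications listed below, these per-position outputs are what downstream layers (for example, a softmax tagger or a CRF) consume.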
NLP crap

Bidirectional algorithms have long been used in domains outside of deep learning; in 2011, the state of the art in part-of-speech (POS) tagging consisted of classifiers trained on windows of text whose outputs then fed into bidirectional decoding algorithms during inference. Collobert et al. cited examples of high-performance POS tagging systems whose decoding systems' bidirectionality was instantiated in dependency networks and Viterbi decoders.[14]

2015, Ling et al., compositional character models for open-vocabulary word representation[16]
2015, Kawakami et al., representing words in context with multilingual supervision[17]
2016, Li et al., sentence relation modeling with auxiliary character-level embedding[18]

Speech / handwriting

In a 2005 paper, Graves et al. used bidirectional LSTMs for improved phoneme classification and recognition.[19] In a 2013 paper, Graves et al. used deep bidirectional LSTMs for hybrid speech recognition.[20]
2014, Zhang et al., distant speech recognition using Highway LSTM RNNs[21]
2016, Zayats et al., disfluency detection with bi-LSTM[22]
Sequence tagging

2015, Huang et al., bi-LSTM-CRFs for sequence tagging[24]

Else

2016, Kiperwasser et al., dependency parsing using bi-LSTM feature representations[25]
2016, Zhang et al., driving behavior recognition model with multi-scale CNN and bi-LSTM[26]
2021, Deligiannidis et al. analyzed performance versus complexity of bi-RNNs versus Volterra nonlinear equalizers in digital coherent systems.[27]
2021, Oluwalade et al., human activity recognition using smartphone and smartwatch sensor data[28]
2021, Dang et al., LSTM models for malware classification (are they bidirectional?)[29]
2016, Lample et al., neural architectures for named entity recognition[30]
2016, Wang et al., image captioning with bi-LSTM[31]
References
User:JPxG/Draft
References
User:JPxG/Draft01
User:JPxG/Draft02
User:JPxG/Draft03
User:JPxG/Draft04
User:JPxG/Draft05
User:JPxG/Draft06
User:JPxG/Draft07
User:JPxG/Draft08
User:JPxG/Draft09
User:JPxG/Draft11
User:JPxG/Draft12
User:JPxG/Draft13
User:JPxG/Draft14
User:JPxG/Draft15
User:JPxG/Draft16
User:JPxG/Draft17
User:JPxG/Draft18
Sand screw
Sand screws.[1][2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20]
References
User:JPxG/Draft19
Flame Warriors
Posts.[1][2][3][4][5][6][7][8][9][10][11][12][13][14]
References
Probably irrelevant: [17] [22] [23] [24] [25] [26] [27] [28] [29] [30] [31] [32] [33]
User:JPxG/Draft20
Pig poop balls
Pig poop balls.[48][49][50][51][52][53][54][55][56][57][58][59][60][61][62]
References
User:JPxG/Draft21
User:JPxG/Draft22
StickerWorld
User:JPxG/Draft23
It refines stuff. Founded in 1915.[15][16][17] It was Shell's first US refinery.[16] The terminal was built in 1913 by the American Gasoline Company.[16] The address is 3485 Pacheco Blvd, Martinez, CA 94553.[16] In 2020 it was Shell's only refinery in California.[18]

It spilled 400,000 gallons of crude oil into the Carquinez Strait in 1988,[15] part of a "handful" of environmental incidents.[15] In March 2019, Shell paid $165,000 to settle 16 air violations from 2015 and 2016.[17] In December 2016 it flared off almost 20 tons.[17] There were 73 flares between 2005 and 2018.[17] A pump fire broke out in a process unit on June 7, 2019, and workers were evacuated.[17]

Gasoline is 85% of production.[16] It also makes "asphalt, diesel, jet turbine fuel, petroleum coke, propane, residual fuel oils, and sulfur".[16][17][19] As of 2017, "the refinery has enjoyed a generally positive relationship with the city of Martinez over the years".[19]

Shell had been trying to sell it since 2016.[15] In 2021, the Mercury News said that it would be affected by new rules (what are they?).[20] The costs would be "approximately 0.62% of estimated annually revenue".[20] PBF suggested a $40 million project that would bring down particulate emissions.[21] It was PBF's second refinery on the West Coast.[18] It is located on an 860-acre site[14][22] and processes 157,000 barrels per day.[14][18] It is a dual-coking refinery with integrated logistics.[14]

Royal Dutch Shell PLC's subsidiary (Equilon Enterprises, doing business as Shell Oil Products US) sold it to PBF Energy.[14] PBF owns it; the Martinez Refining Co. LLC (which PBF owns) operates it.[15] They were "in talks" in 2017.[19] The sale was completed in February 2020,[22] for $1.2 billion.[14][23][24][25] The cost of the assets was $960.0 million, plus the value of the inventory.[22] It was part of a global downstream divestment by Shell.[14] Plans made to (more stuff about Shell's plans afterward, etc).[14] The deal included "Martinez's on-site logistics assets, including a deep-water marine terminal, product distribution terminals, and refinery crude and product storage installations with about 8.8 million bbl of shell capacity",[14][24] plus an "adjacent truck rack and terminal".[23] It has a Nelson complexity index of 16.1 ("one of the most complex refineries in the United States").[22][24][18] A renewable diesel thingy was proposed using idled equipment.[23][24][18]

According to Dun & Bradstreet, annual revenue is $147.65 million.[16] In 2019, Shell employed over 700 people at the site;[17] these employees were to be offered jobs at PBF when it took over.[17]

The freaking goddamn coronavirus happened in 2020. PBF said its refineries were running at 30% capacity.[17] It sold five of the hydrogen plants nationwide, for a total of $530 million;[26] two of them were at the Martinez facility.[17] There are three hydrogen plants there, one of which had been owned by Air Products since 1996.[17] They separate the sulfur from the other shit.[17] They are steam methane reformer hydrogen production plants.[27]

In June 2021, PBF said that if new regulations went through, the Martinez plant would go kaput.[28] This had to do with fluidized catalytic cracking units (more info in source).[28]

Malfunctions occurred in July 2018, and a health advisory was issued in Martinez and Pacheco.[29][30] A flaring incident on July 6 involved a fire at a compressor unit and released more than 100 lb of hydrogen sulfide.[30] There were "five refinery problems over four days", releasing more than 8,500 lbs of gas; lots of shit in this source.[30]

Shit got slow during the freaking coronavirus: throughput was 30% below expectations in April 2020, and it transitioned to idle operating status.[27] A flaring incident in December 2016 released "thousands of pounds of toxic gas", caused by a power outage at a decades-old substation; 39,000 lb of light hydrocarbons and hydrogen sulfide were sent to flares on December 19, 2016.[31][32] More stuff here.[32] And here.[33]
References
User:JPxG/Draft24
Flying Caduceus
Flying Caduceus.
References
User:JPxG/Draft25
Wheeler Ridge Compressor Station
The epic freaking Wheeler Ridge Compressor Station.[1][2][3][4][5][6][7]
References
User:JPxG/Draft26
Baneposting
For you.[1][2][3][4][5][6]
References
User:JPxG/Draft27
Mindustry is a real-time strategy, factory management, and tower defense game developed and published by Anuken as libre software under the GNU General Public License.[1][2] It is available for Windows, macOS, Linux, Android and iOS.[3]
References
User:JPxG/Draft2
References
User:JPxG/Draft3
References
User:JPxG/Draft4
Backpack girl.[1][2][3][4][5][6][7][8]
References