<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>http://www.colloquiam.com/wd/index.php?action=history&amp;feed=atom&amp;title=Wu_et_al_2020a</id>
		<title>Wu et al 2020a - Revision history</title>
		<link rel="self" type="application/atom+xml" href="http://www.colloquiam.com/wd/index.php?action=history&amp;feed=atom&amp;title=Wu_et_al_2020a"/>
		<link rel="alternate" type="text/html" href="http://www.colloquiam.com/wd/index.php?title=Wu_et_al_2020a&amp;action=history"/>
		<updated>2026-05-11T06:49:54Z</updated>
		<subtitle>Revision history for this page on the wiki</subtitle>
		<generator>MediaWiki 1.27.0-wmf.10</generator>

	<entry>
		<id>http://www.colloquiam.com/wd/index.php?title=Wu_et_al_2020a&amp;diff=214140&amp;oldid=prev</id>
		<title>Scipediacontent: Scipediacontent moved page Draft Content 591095309 to Wu et al 2020a</title>
		<link rel="alternate" type="text/html" href="http://www.colloquiam.com/wd/index.php?title=Wu_et_al_2020a&amp;diff=214140&amp;oldid=prev"/>
				<updated>2021-02-15T09:45:57Z</updated>
		
		<summary type="html">&lt;p&gt;Scipediacontent moved page &lt;a href=&quot;/public/Draft_Content_591095309&quot; class=&quot;mw-redirect&quot; title=&quot;Draft Content 591095309&quot;&gt;Draft Content 591095309&lt;/a&gt; to &lt;a href=&quot;/public/Wu_et_al_2020a&quot; title=&quot;Wu et al 2020a&quot;&gt;Wu et al 2020a&lt;/a&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr style='vertical-align: top;' lang='en'&gt;
				&lt;td colspan='1' style=&quot;background-color: white; color:black; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan='1' style=&quot;background-color: white; color:black; text-align: center;&quot;&gt;Revision as of 09:45, 15 February 2021&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan='2' style='text-align: center;' lang='en'&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Scipediacontent</name></author>	</entry>

	<entry>
		<id>http://www.colloquiam.com/wd/index.php?title=Wu_et_al_2020a&amp;diff=214139&amp;oldid=prev</id>
		<title>Scipediacontent: Created page with &quot; == Abstract ==  Short-term traffic speed prediction is a promising research topic in intelligent transportation systems (ITSs), which also plays an important role in the real...&quot;</title>
		<link rel="alternate" type="text/html" href="http://www.colloquiam.com/wd/index.php?title=Wu_et_al_2020a&amp;diff=214139&amp;oldid=prev"/>
				<updated>2021-02-15T09:45:55Z</updated>
		
		<summary type="html">&lt;p&gt;Created page with &amp;quot; == Abstract ==  Short-term traffic speed prediction is a promising research topic in intelligent transportation systems (ITSs), which also plays an important role in the real...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
Short-term traffic speed prediction is a promising research topic in intelligent transportation systems (ITSs), which also plays an important role in the real-time decision-making of traffic control and guidance systems. However, urban traffic speed exhibits strong temporal and spatial correlations, together with complex nonlinearity and randomness, which makes it challenging to forecast short-term traffic speeds accurately and efficiently. We investigated the relevant literature and found that although most methods achieve good prediction performance with complete sample data, they struggle to maintain accuracy when the database has a certain rate of missing values. Recent studies have shown that deep learning methods, especially long short-term memory (LSTM) models, yield good results in short-term traffic flow prediction. Furthermore, the attention mechanism can properly assign weights to distinguish the importance of traffic time sequences, thereby further improving the computational efficiency of the prediction model. Therefore, we propose a framework for short-term traffic speed prediction, comprising a data preprocessing module and a short-term traffic prediction module. In the data preprocessing module, the missing traffic data are repaired to provide a complete dataset for subsequent prediction. In the prediction module, we propose a combined deep learning method, an attention-based LSTM (ATT-LSTM) model, for predicting short-term traffic speed on urban roads. The proposed framework was applied to the urban road network in Nanshan District, Shenzhen, Guangdong Province, China, with a 30-day traffic speed dataset (floating car data) used as the experimental sample. Results show that the proposed method outperforms other deep learning algorithms (such as the recurrent neural network (RNN) and convolutional neural network (CNN)) in terms of both computational efficiency and prediction accuracy.
The attention mechanism reduces the error of the LSTM model by up to 12.4% and improves the prediction performance.&lt;br /&gt;
&lt;br /&gt;
Document type: Article&lt;br /&gt;
&lt;br /&gt;
== Full document ==&lt;br /&gt;
&amp;lt;pdf&amp;gt;Media:Draft_Content_591095309-beopen795-7358-document.pdf&amp;lt;/pdf&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Original document ==&lt;br /&gt;
&lt;br /&gt;
The different versions of the original document can be found in:&lt;br /&gt;
&lt;br /&gt;
* [http://dx.doi.org/10.1155/2020/8863724 http://dx.doi.org/10.1155/2020/8863724] under the license https://creativecommons.org/licenses/by/4.0/&lt;br /&gt;
&lt;br /&gt;
* [http://downloads.hindawi.com/journals/jat/2020/8863724.pdf http://downloads.hindawi.com/journals/jat/2020/8863724.pdf],&lt;br /&gt;
: [http://downloads.hindawi.com/journals/jat/2020/8863724.xml http://downloads.hindawi.com/journals/jat/2020/8863724.xml],&lt;br /&gt;
: [http://dx.doi.org/10.1155/2020/8863724 http://dx.doi.org/10.1155/2020/8863724]&lt;br /&gt;
&lt;br /&gt;
 under the license https://creativecommons.org/licenses/by/4.0/&lt;/div&gt;</summary>
		<author><name>Scipediacontent</name></author>	</entry>

	</feed>