COMP9414 24T2
Artificial Intelligence
Assignment 1 - Artificial neural networks
Due: Week 5, Wednesday, 26 June 2024, 11:55 PM.
1 Problem context
Time Series Air Quality Prediction with Neural Networks: In this
assignment, you will delve into the realm of time series prediction using neural
network architectures. You will explore both classification and estimation
tasks using a publicly available dataset.
You will be provided with a dataset named “Air Quality” [1], available on the UCI Machine Learning Repository (https://archive.ics.uci.edu/dataset/360/air+quality). We tailored this dataset for this assignment and made some modifications. Therefore, please only use the attached dataset for this assignment.
The given dataset contains 8,358 instances of hourly averaged responses from an array of five metal oxide chemical sensors embedded in an air quality chemical multisensor device. The device was located in the field in a significantly polluted area at road level within an Italian city. Data were recorded from March 2004 to February 2005 (one year), representing the longest freely available recordings of on-field deployed air quality chemical sensor device responses. Ground truth hourly averaged concentrations for carbon monoxide, non-methane hydrocarbons, benzene, total nitrogen oxides, and nitrogen dioxide, among other variables, were provided by a co-located reference-certified analyser. The variables included in the dataset are listed in Table 1. Missing values within the dataset are tagged with the value -200 (a minimal loading sketch that converts these tags into proper missing values follows Table 1).
Table 1: Variables within the dataset.

Variable          Meaning
CO(GT)            True hourly averaged concentration of carbon monoxide
PT08.S1(CO)       Hourly averaged sensor response
NMHC(GT)          True hourly averaged overall non-methane hydrocarbons concentration
C6H6(GT)          True hourly averaged benzene concentration
PT08.S2(NMHC)     Hourly averaged sensor response
NOx(GT)           True hourly averaged NOx concentration
PT08.S3(NOx)      Hourly averaged sensor response
NO2(GT)           True hourly averaged NO2 concentration
PT08.S4(NO2)      Hourly averaged sensor response
PT08.S5(O3)       Hourly averaged sensor response
T                 Temperature
RH                Relative Humidity
AH                Absolute Humidity
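
To make the -200 convention concrete, the sketch below shows one possible way to load the data and convert those tags into proper missing values with pandas; the file name AirQuality.csv is an assumption, so adjust it (and the separator, if needed) to match the attached file.

import numpy as np
import pandas as pd

# Assumed file name; adjust to the dataset attached to the assignment
# (add sep=";" or similar if the file is not comma-separated).
df = pd.read_csv("AirQuality.csv")

# Missing values are tagged with -200 in this dataset; replace them with NaN
# so that pandas, Sklearn, and Keras treat them explicitly as missing.
df = df.replace(-200, np.nan)

print(df.shape)          # expected: 8,358 rows and the columns of Table 1
print(df.isna().sum())   # number of missing values per variable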
2 Activities
This assignment focuses on two main objectives:
• Classification task: You should develop a neural network that can predict whether the concentration of carbon monoxide (CO) exceeds a certain threshold, namely the mean of the CO(GT) values, based on historical air quality data. This task involves binary classification, where your model learns to classify instances into two categories: above or below the threshold. To determine the threshold, you must first calculate the mean value of CO(GT), excluding unknown data (missing values). Then, use this threshold to decide whether the value predicted by your network is above or below it (a minimal sketch of computing this threshold is given after the summary below). You are free to choose and design your own network, and there are no limitations on its structure. However, your network should be capable of handling missing values.
• Regression task: You should develop a neural network that can predict the concentration of nitrogen oxides (NOx) based on other air quality features. This task involves estimating a continuous numerical value (the NOx concentration) from the input features using regression techniques. You are free to choose and design your own network, and there is no limitation on its structure; however, your model should be able to deal with missing values.
In summary, the classification task aims to divide instances into two categories (exceeding or not exceeding the CO(GT) threshold), while the regression task aims to predict a continuous numerical value (the NOx concentration).
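
As referenced in the classification task above, the following sketch shows one way to compute the CO(GT) threshold and the binary target, assuming the dataframe df from the loading sketch with -200 already replaced by NaN; the label column name is illustrative.

# df is the dataframe from the loading sketch, with -200 replaced by NaN.
co = df["CO(GT)"]

# Threshold: mean of CO(GT), excluding missing values (NaN is skipped by mean()).
co_threshold = co.mean()

# Binary target: 1 if above the threshold, 0 otherwise.
# Rows where CO(GT) itself is missing cannot be labelled and are dropped here.
labelled = df[co.notna()].copy()
labelled["co_above_mean"] = (labelled["CO(GT)"] > co_threshold).astype(int)

print(f"CO(GT) threshold (mean): {co_threshold:.3f}")
print(labelled["co_above_mean"].value_counts())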
2.1 Data preprocessing
It is expected that you analyse the provided data and perform any required preprocessing. Some of the tasks during preprocessing might include the ones shown below; however, not all of them are necessary, and you should evaluate each of them against the results obtained.
(a) Identify the variation range of the input and output variables.
(b) Plot each variable to observe the overall behaviour of the process.
(c) In case outliers or missing data are detected, correct the data accordingly.
(d) Split the data for training and testing.
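
One possible preprocessing sketch covering items (c) and (d) is shown below; the feature selection, mean imputation, chronological split, and standardisation are assumptions to be evaluated against your own results, not requirements.

from sklearn.model_selection import train_test_split

# Illustrative feature choice for the classification task; adapt to your design.
feature_cols = ["PT08.S1(CO)", "PT08.S2(NMHC)", "PT08.S3(NOx)",
                "PT08.S4(NO2)", "PT08.S5(O3)", "T", "RH", "AH"]
X = labelled[feature_cols].copy()
y = labelled["co_above_mean"].values

# Chronological hold-out split (no shuffling, since this is a time series).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False)

# One simple way to deal with missing inputs: impute with column means
# computed on the training portion only, to avoid information leakage.
train_means = X_train.mean()
X_train = X_train.fillna(train_means)
X_test = X_test.fillna(train_means)

# Standardise inputs using training statistics only.
mu, sigma = X_train.mean(), X_train.std()
X_train = (X_train - mu) / sigma
X_test = (X_test - mu) / sigma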
2.2 Design of the neural network
You should select and design neural architectures for addressing both the classification and regression problems described above. In each case, consider the following steps:
(a) Design the network and decide the number of layers, units, and their
respective activation functions.
(b) Remember that it is recommended the total number of parameters of your network satisfies Nw < (number of samples)/10.
(c) Create the neural network using Keras and TensorFlow.
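
A minimal Keras sketch of the two networks is shown below, reusing X_train from the preprocessing sketch; the layer sizes and activations are assumptions, and the final check illustrates the recommended bound Nw < (number of samples)/10.

from tensorflow import keras
from tensorflow.keras import layers

n_features = X_train.shape[1]
n_samples = len(X_train)

# Binary classification network: sigmoid output for above/below the CO(GT) threshold.
clf_model = keras.Sequential([
    keras.Input(shape=(n_features,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(8, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# Regression network: linear output for the NOx(GT) concentration.
reg_model = keras.Sequential([
    keras.Input(shape=(n_features,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(8, activation="relu"),
    layers.Dense(1),
])

# Recommended bound on the number of trainable parameters (item (b)).
for model in (clf_model, reg_model):
    assert model.count_params() < n_samples / 10, "network may have too many parameters"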
2.3 Training
In this section, you have to train your proposed neural network. Consider
the following steps:
(a) Decide the training parameters such as the loss function, optimizer, batch size, learning rate, and number of epochs.
(b) Train the neural model and monitor the loss values during the process.
(c) Check for possible overfitting problems.
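
One possible training setup for the classification network is sketched below; the loss, optimizer, learning rate, batch size, number of epochs, and the early-stopping callback used to watch for overfitting are assumptions. The regression network is trained analogously with a mean-squared-error loss.

from tensorflow.keras import callbacks, optimizers

clf_model.compile(
    optimizer=optimizers.Adam(learning_rate=1e-3),  # optimizer and learning rate
    loss="binary_crossentropy",                     # loss for binary classification
    metrics=["accuracy"],
)

# Early stopping is one simple way to check and limit overfitting:
# training stops once the validation loss stops improving.
early_stop = callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True)

history = clf_model.fit(
    X_train.values, y_train,
    validation_split=0.2,    # keep part of the training data for validation
    epochs=100,
    batch_size=32,
    callbacks=[early_stop],
    verbose=0,
)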
2.4 Validating the neural model
Assess your results by plotting the training results and the network response for the test inputs against the test targets. Compute error indexes to complement the visual analysis.
(a) For the classification task, draw two different plots to illustrate your results over the epochs. In the first plot, show the training and validation loss over the epochs. In the second plot, show the training and validation accuracy over the epochs. For example, Figure 1 and Figure 2 show the loss and classification accuracy plots for 100 epochs, respectively (a minimal plotting sketch follows the figure captions).
Figure 1: Loss plot for the classification task.
Figure 2: Accuracy plot for the classification task.
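
As mentioned in item (a), both plots can be produced directly from the Keras history object; a minimal matplotlib sketch, assuming the history variable from the training sketch, is:

import matplotlib.pyplot as plt

# Plot 1: training and validation loss over the epochs.
plt.figure()
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.legend()
plt.title("Loss plot for the classification task")

# Plot 2: training and validation accuracy over the epochs.
plt.figure()
plt.plot(history.history["accuracy"], label="training accuracy")
plt.plot(history.history["val_accuracy"], label="validation accuracy")
plt.xlabel("Epoch")
plt.ylabel("Accuracy")
plt.legend()
plt.title("Accuracy plot for the classification task")
plt.show()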
(b) For the classification task, compute a confusion matrix (https://en.wikipedia.org/wiki/Confusion_matrix) including True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN) counts, as shown in Table 2. Moreover, report accuracy and precision for your test data and mention the number of tested samples, as shown in Table 3 (the numbers shown in both tables are randomly chosen and may not be consistent with each other). For instance, the Sklearn library offers a wide range of metric functions (https://scikit-learn.org/stable/api/sklearn.metrics.html), including the confusion matrix (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html), accuracy, and precision. You can use Sklearn's built-in metric functions to calculate the mentioned metrics or develop your own functions; a minimal sketch using Sklearn follows Table 3.
Table 2: Confusion matrix for the test data for the classification task.

                       Positive (Actual)   Negative (Actual)
Positive (Predicted)          103                   6
Negative (Predicted)            6                  75
Table 3: Accuracy and precision for the test data for the classification task.

                        Accuracy   Precision   Number of Samples
CO(GT) classification      63%        60%             190
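
The minimal sketch referenced in item (b), assuming the clf_model and test split from the earlier sketches and a 0.5 decision threshold on the network output, is:

from sklearn.metrics import accuracy_score, confusion_matrix, precision_score

# Predicted probabilities -> binary labels using a 0.5 decision threshold.
y_prob = clf_model.predict(X_test.values).ravel()
y_pred = (y_prob >= 0.5).astype(int)

# Confusion matrix counts for the binary labels {0, 1}.
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print(f"TP={tp}, TN={tn}, FP={fp}, FN={fn}")

# Accuracy, precision, and number of tested samples, as in Table 3.
print("Accuracy :", accuracy_score(y_test, y_pred))
print("Precision:", precision_score(y_test, y_pred))
print("Number of samples:", len(y_test))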
(c) For the regression task, draw two different plots to illustrate your results. In the first plot, show how the selected loss function varies for both the training and validation sets through the epochs. In the second plot, show the final estimation results for the validation set. For instance, Figure 3 and Figure 4 show the loss function and the network outputs vs the actual NOx(GT) values for a validation set, respectively. In Figure 4, no data preprocessing has been performed; however, as mentioned above, it is expected that you include preprocessing in your assignment. A minimal sketch of the estimation plot follows the figure captions.

Figure 3: Loss plot for the regression task.
Figure 4: Estimated and actual NOx(GT) for the validation set.
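
The estimation plot referenced in item (c) can be drawn as follows; reg_model, X_test_nox, and y_test_nox are hypothetical names for a trained regression network and its NOx(GT) test split, so substitute your own variables.

import matplotlib.pyplot as plt

# Hypothetical names: reg_model trained on NOx(GT), X_test_nox / y_test_nox test split.
nox_pred = reg_model.predict(X_test_nox.values).ravel()

plt.figure()
plt.plot(y_test_nox, label="actual NOx(GT)")
plt.plot(nox_pred, label="estimated NOx(GT)")
plt.xlabel("Test sample index")
plt.ylabel("NOx(GT) concentration")
plt.legend()
plt.title("Estimated and actual NOx(GT) for the validation set")
plt.show()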
(d) For the regression task, report performance indexes including the Root Mean Squared Error (RMSE), the Mean Absolute Error (MAE) (see a discussion in [2]), and the number of samples for your estimation of the NOx(GT) values in a table.
The Root Mean Squared Error (RMSE) measures the differences between the observed values and the predicted ones and is defined as follows:

RMSE = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2 },    (1)

where n is the number of samples, Y_i is the actual label, and \hat{Y}_i is the predicted value. In the same way, the MAE can be defined as the average of the absolute errors:

MAE = \frac{1}{n} \sum_{i=1}^{n} |Y_i - \hat{Y}_i|.    (2)
Table 4 shows an example of the performance indexes (all numbers are randomly chosen and may not be consistent with each other). As mentioned before, the Sklearn library offers a wide range of metric functions, including RMSE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.root_mean_squared_error.html) and MAE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.mean_absolute_error.html). You can use Sklearn's built-in metric functions to calculate the mentioned metrics or develop your own functions; a minimal sketch follows Table 4.

Table 4: Result table for the test data for the regression task.

 RMSE     MAE    Number of Samples
90.60   50.35                   55
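
The minimal sketch referenced above, using the hypothetical reg_model and NOx(GT) test split from the earlier regression sketch, is shown below; note that root_mean_squared_error requires scikit-learn 1.4 or newer, otherwise mean_squared_error(..., squared=False) can be used instead.

from sklearn.metrics import mean_absolute_error, root_mean_squared_error

# Hypothetical names, as in the estimation-plot sketch.
nox_pred = reg_model.predict(X_test_nox.values).ravel()

rmse = root_mean_squared_error(y_test_nox, nox_pred)   # Equation (1)
mae = mean_absolute_error(y_test_nox, nox_pred)        # Equation (2)

print(f"RMSE: {rmse:.2f}")
print(f"MAE : {mae:.2f}")
print(f"Number of samples: {len(y_test_nox)}")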
3 Testing and discussing your code
As part of the assignment evaluation, your code will be tested by tutors together with you in a discussion session carried out during the tutorial in Week 6. The assignment has a total of 25 marks. The discussion is mandatory; therefore, we will not mark any assignment that is not discussed with tutors.
You are expected to propose and build neural models for the classification and regression tasks. The minimal output we expect to see is the set of results described above in Section 2.4. You will receive marks for each of these subsections as shown in Table 5 (7 marks in total). However, it is fine to include any other outcomes that highlight particular aspects when testing and discussing your code with your tutor.
For marking your results, you should be prepared to simulate your neural model with a generalisation set we have set aside for that purpose. You must anticipate this by including in your submission a script ready to open a file (with the same characteristics as the given dataset but with fewer data points), simulate the network, and perform all the validation tests described in Section 2.4 (b) and (d) (accuracy, precision, RMSE, MAE). It is recommended to save all of your hyper-parameters and weights (your model in general) so you can load your network and perform the analysis later in your discussion session. A minimal sketch of saving the models and of such a generalisation script is shown below.
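
In the sketch, the file name generalisation.csv and the preprocess() helper are placeholders for your own file handling and preprocessing pipeline, which should reuse the statistics saved from training.

import numpy as np
import pandas as pd
from tensorflow import keras
from sklearn.metrics import (accuracy_score, mean_absolute_error,
                             precision_score, root_mean_squared_error)

# Save the trained models (architecture + weights); use ".h5" on older Keras versions.
clf_model.save("co_classifier.keras")
reg_model.save("nox_regressor.keras")

# ----- Generalisation script (run during the discussion session) -----
clf_model = keras.models.load_model("co_classifier.keras")
reg_model = keras.models.load_model("nox_regressor.keras")

# Placeholder file name: same columns as the given dataset, fewer data points.
new_df = pd.read_csv("generalisation.csv").replace(-200, np.nan)

# preprocess() is a placeholder for your own pipeline (imputation, scaling,
# CO(GT) threshold labelling) reusing the statistics computed on the training data.
X_clf, y_clf, X_reg, y_reg = preprocess(new_df)

y_pred_clf = (clf_model.predict(X_clf).ravel() >= 0.5).astype(int)
y_pred_reg = reg_model.predict(X_reg).ravel()

print("Accuracy :", accuracy_score(y_clf, y_pred_clf))
print("Precision:", precision_score(y_clf, y_pred_clf))
print("RMSE     :", root_mean_squared_error(y_reg, y_pred_reg))
print("MAE      :", mean_absolute_error(y_reg, y_pred_reg))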
For the classification task you need to compute accuracy and precision, while for the regression task you need to compute RMSE and MAE, using the generalisation set. You will receive 3 marks for each task, given successful results. Expected results are as follows:
• For the classification task, your network should achieve at least 85% accuracy and precision. Accuracy and precision lower than that will result in a score of 0 marks for that specific section.
• For the regression task, it is expected that you achieve an RMSE of at most 280 and an MAE of at most 220 for unseen data points. Errors higher than the mentioned values will be marked as 0 marks.
Finally, you will receive 1 mark for code readability for each task, and your tutor will also give you a maximum of 5 marks for each task depending on the level of code understanding, as follows: 5 = Outstanding, 4 = Great, 3 = Fair, 2 = Low, 1 = Deficient, 0 = No answer.
Table 5: Marks for each task.

Task                                                                         Marks
Results obtained with the given dataset:
  Loss and accuracy plots for the classification task                        2 marks
  Confusion matrix, accuracy, and precision tables for the classification task   2 marks
  Loss and estimated NOx(GT) plots for the regression task                   2 marks
  Performance indexes table for the regression task                          1 mark
Results obtained with the generalisation dataset:
  Accuracy and precision for the classification task                         3 marks
  RMSE and MAE for the regression task                                       3 marks
Code understanding and discussion:
  Code readability for the classification task                               1 mark
  Code readability for the regression task                                   1 mark
  Code understanding and discussion for the classification task              5 marks
  Code understanding and discussion for the regression task                  5 marks
Total                                                                        25 marks
4 Submitting your assignment
The assignment must be done individually. You must submit your assignment solution via Moodle. This will consist of a single .ipynb Jupyter file, which should contain all the necessary code for reading files, data preprocessing, network architecture, and result evaluation. Additionally, your file should include short text descriptions to help markers better understand your code. Please be mindful that providing clean and easy-to-read code is part of your assignment.
Please indicate your full name and your zID at the top of the file as a comment. You can submit as many times as you like before the deadline; later submissions overwrite earlier ones. After submitting your file, it is good practice to take a screenshot of it for future reference.
Late submission penalty: UNSW has a standard late submission penalty of 5% of your mark per day, capped at five days from the assessment deadline; after that, students cannot submit the assignment.
5 Deadline and questions
Deadline: Week 5, Wednesday, 26 June 2024, 11:55 PM. Please use the forum on Moodle to ask questions related to the project. We will prioritise questions asked in the forum. However, you should not share your code there, to avoid making it public and enabling possible plagiarism; in that case, use the course email cs9414@cse.unsw.edu.au as an alternative.
Although we try to answer questions as quickly as possible, we might take up to 1 or 2 business days to reply; therefore, last-minute questions might not be answered in time.
6 Plagiarism policy
Your program must be entirely your own work. Plagiarism detection software
might be used to compare submissions pairwise (including submissions for
any similar projects from previous years) and serious penalties will be applied,
particularly in the case of repeat offences.
Do not copy from others. Do not allow anyone to see your code.
Please refer to the UNSW Policy on Academic Honesty and Plagiarism if you
require further clarification on this matter.
References
[1] De Vito, S., Massera, E., Piga, M., Martinotto, L. and Di Francia, G., 2008. On field calibration of an electronic nose for benzene estimation in an urban pollution monitoring scenario. Sensors and Actuators B: Chemical, 129(2), pp. 750-757.

[2] Hodson, T. O., 2022. Root mean square error (RMSE) or mean absolute error (MAE): When to use them or not. Geoscientific Model Development Discussions, 2022, pp. 1-10.

請(qǐng)加QQ:99515681  郵箱:99515681@qq.com   WX:codinghelp













 

標(biāo)簽:

掃一掃在手機(jī)打開(kāi)當(dāng)前頁(yè)
  • 上一篇:代寫(xiě)指標(biāo)編寫(xiě) 編寫(xiě)同花順指標(biāo)公式 代編公式
  • 下一篇:ECON2101代做、代寫(xiě)Python/c++設(shè)計(jì)編程
  • CMT219代寫(xiě)、代做Java程序語(yǔ)言
  • 代做MATH1033、代寫(xiě)c/c++,Java程序語(yǔ)言
  • 代做CSCI 2525、c/c++,Java程序語(yǔ)言代寫(xiě)
  • COMP 315代寫(xiě)、Java程序語(yǔ)言代做
  • 昆明生活資訊

    昆明圖文信息
    蝴蝶泉(4A)-大理旅游
    蝴蝶泉(4A)-大理旅游
    油炸竹蟲(chóng)
    油炸竹蟲(chóng)
    酸筍煮魚(yú)(雞)
    酸筍煮魚(yú)(雞)
    竹筒飯
    竹筒飯
    香茅草烤魚(yú)
    香茅草烤魚(yú)
    檸檬烤魚(yú)
    檸檬烤魚(yú)
    昆明西山國(guó)家級(jí)風(fēng)景名勝區(qū)
    昆明西山國(guó)家級(jí)風(fēng)景名勝區(qū)
    昆明旅游索道攻略
    昆明旅游索道攻略
  • 短信驗(yàn)證碼平臺(tái) 理財(cái) WPS下載

    關(guān)于我們 | 打賞支持 | 廣告服務(wù) | 聯(lián)系我們 | 網(wǎng)站地圖 | 免責(zé)聲明 | 幫助中心 | 友情鏈接 |

    Copyright © 2025 kmw.cc Inc. All Rights Reserved. 昆明網(wǎng) 版權(quán)所有
    ICP備06013414號(hào)-3 公安備 42010502001045

    美女扒开腿免费视频_蜜桃传媒一区二区亚洲av_先锋影音av在线_少妇一级淫片免费放播放_日本泡妞xxxx免费视频软件_一色道久久88加勒比一_熟女少妇一区二区三区_老司机免费视频_潘金莲一级黄色片_精品国产精品国产精品_黑人巨大猛交丰满少妇
    中文字幕无码毛片免费看| 精品人妻无码中文字幕18禁| 丰满人妻一区二区三区53视频| 国产波霸爆乳一区二区| 久久国产高清视频| 亚洲欧美日韩色| www.狠狠爱| 韩国三级hd中文字幕有哪些| 免费a级黄色片| 一级黄色片日本| 亚洲图片综合网| 老司机福利在线观看| 国产精品熟女一区二区不卡| 丰满少妇一区二区三区| 9.1片黄在线观看| 日本人妻一区二区三区| 国产美女精品久久| 亚洲国产欧美日韩在线| 国产精品亚洲无码| 91日韩中文字幕| 亚洲欧美色图视频| 爱爱视频免费在线观看| 99久久国产精| 超级砰砰砰97免费观看最新一期| 波多野结衣 在线| 美女被艹视频网站| 国产成人av一区二区三区不卡| 麻豆精品一区二区三区视频| 中文字幕被公侵犯的漂亮人妻| 丰满人妻一区二区三区大胸| 亚洲av熟女国产一区二区性色| 男人操女人下面视频| 中文字幕欧美激情极品| 7788色淫网站小说| 免费黄色在线播放| 一级特黄曰皮片视频| 国产熟女高潮一区二区三区| 少妇被躁爽到高潮无码文| 午夜精产品一区二区在线观看的| 99精品一区二区三区无码吞精| 中文字幕avav| 日韩女优一区二区| 国产又粗又长又黄的视频| 舐め犯し波多野结衣在线观看| 国产精品1000部啪视频| 亚洲av成人片色在线观看高潮| 国产精品嫩草69影院| 国产精品成人免费观看| 亚洲av无码一区二区三区在线| 美女福利视频网| 亚洲色图 激情小说| 最近中文字幕免费视频| 国产全是老熟女太爽了| 国产麻豆天美果冻无码视频 | 偷偷色噜狠狠狠狠的777米奇| 日本黄色三级网站| 潘金莲一级淫片aaaaa| 国产人妻精品午夜福利免费| 性折磨bdsm欧美激情另类| 无码人妻久久一区二区三区蜜桃| 日韩精品xxx| 国产老熟女伦老熟妇露脸| 亚洲第一黄色网址| 国内精品卡一卡二卡三| 永久av免费网站| 天天看片中文字幕| 久久久久久国产精品日本| 欧美激情一区二区三区p站| 亚洲av无码一区二区三区网址 | 波多野结衣欲乱| 91精品一区二区三区蜜桃| 国产免费一区二区三区四区| 原创真实夫妻啪啪av| 一本加勒比波多野结衣| 欧美黄色高清视频| aaaaa黄色片| 一区二区精品免费| 午夜爽爽爽男女免费观看| 国产日韩视频一区| 91久久免费视频| 手机在线免费看片| 国产成人精品无码片区在线| caopeng视频| 人妻精品久久久久中文字幕69| 中文字幕在线免费看线人| 中日韩一级黄色片| 日本丰满少妇裸体自慰 | 欧美风情第一页| 青青青视频在线播放| www.99re7| 白白色免费视频| 亚洲av无一区二区三区久久| 青青草福利视频| 9.1人成人免费视频网站| 亚洲精品国产91| 欧美极品jizzhd欧美仙踪林| 永久免费观看片现看| 亚洲熟女乱综合一区二区三区| 刘亦菲国产毛片bd| 日本黄色特级片| 亚洲精品久久久久久| 日韩精品久久久久久久的张开腿让| 超碰caoprom| 美国黄色小视频| 少妇的滋味中文字幕bd| 99久久人妻无码中文字幕系列| 久久国产美女视频| 欧美一区二区三区观看| 色婷婷在线影院| 久久无码人妻精品一区二区三区 | 欧美一区二区三区影院| 超碰人人人人人人人| 麻豆精品免费视频| 亚洲激情视频小说| 中文字幕在线观看的网站| 日本69式三人交| 日本一卡二卡在线| 国产 中文 字幕 日韩 在线| 李丽珍裸体午夜理伦片| 成人区人妻精品一区二| 日韩高清一二三区| 无码人妻丰满熟妇啪啪网站| 波多野结衣不卡视频| 精品自拍偷拍视频| 人妻精品久久久久中文字幕69| 国产成人av片| 99re久久精品国产| 欧洲美熟女乱又伦| 顶级黑人搡bbw搡bbbb搡| 无码人妻一区二区三区在线视频| 337p日本欧洲亚洲大胆张筱雨| 国产一精品一aⅴ一免费| 玖草视频在线观看| 丰满的亚洲女人毛茸茸| 黄色香蕉视频在线观看| 制服.丝袜.亚洲.中文.综合懂| 国内精品免费视频| 中文字幕5566| 天美传媒免费在线观看| 一区二区三区人妻| jizz中文字幕| 日本成人在线免费| 亚洲最大成人网站| 国模无码国产精品视频| 99re久久精品国产| 九九热最新地址| 无码国产69精品久久久久同性| 成人免费黄色小视频| 一级欧美一级日韩片| 久久午夜精品视频| 丰满少妇中文字幕| 91视频啊啊啊| 99re这里只有| 日本一级片免费| 国产大学生av| 成人无码av片在线观看| 久久爱一区二区| 台湾佬美性中文| 女~淫辱の触手3d动漫| 337p日本欧洲亚洲大胆张筱雨| 国产美女视频免费观看下载软件| 亚洲综合色一区| 全网免费在线播放视频入口 | 亚洲精品一二三四| 国产二级一片内射视频播放 | 性爱在线免费视频| 亚洲国产成人精品综合99| 亚洲久久久久久| 亚洲欧美精品久久| 一级特黄曰皮片视频| 91aaa在线观看| 黄色a一级视频| 国产探花在线视频| 麻豆视频免费在线播放| 制服下的诱惑暮生| 免费看污片的网站| 在线播放第一页| 日本xxxxxxxxx18| 日本aaa视频| 国产裸体视频网站| 天天操天天干天天操天天干| 欧美日韩一区二区区| 亚洲天堂小视频| 综合 欧美 亚洲日本| yy1111111| 国产十六处破外女视频| 国产免费一区二区三区四区| 女~淫辱の触手3d动漫| 精品人妻一区二区三区免费| 精品人妻一区二区三区蜜桃视频| 欧亚乱熟女一区二区在线| 日韩三级在线观看视频| 精品一区二区三区蜜桃在线| 国产精品手机在线观看| 日本一级片免费| 91香蕉视频在线播放| 亚洲无人区码一码二码三码的含义| 极品白嫩少妇无套内谢| 欧洲第一无人区观看| 91精品人妻一区二区三区四区|