Descriptive and predictive modeling of the change in the selection process for admission to higher education

Joel Felipe de Oliveira Gaya, Sidnei da Fonseca Pereira Jr, Paula Fernanda Schiavo, Eduardo Borges, Silvia Silva da Costa Botelho

Abstract


One of the challenges faced by higher education institutions is obtaining metrics that help monitor student performance, given the constant transformations of education in the country. Data mining has become one of the main tools for this task, since it can extract implicit, previously unknown, and potentially useful information to support decision making. In this work we analyzed a database of students from the Computer Engineering program of the Universidade Federal do Rio Grande (FURG), where the form of assessment used for admission to higher education was changed. Descriptive models were built to evaluate the impact of this change, and they show that student performance decreased. Using predictive models, we found that age, admission grades, and the number of times courses are repeated are the decisive factors in whether a student completes the degree.
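To make the predictive step concrete, the sketch below shows one way a classifier of this kind could be trained on the three factors highlighted above (age, admission grade, and number of course repetitions). It is a minimal illustration only, assuming Python with scikit-learn; the feature values, the toy labels, and the choice of a decision tree are hypothetical and do not reproduce the models or data used in the paper.

# Minimal sketch: predicting course completion from age, admission grade,
# and number of course repetitions. All data below is made up for illustration.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Each row: [age at admission, admission grade, number of course repetitions]
X = [
    [18, 720.0, 0],
    [19, 680.5, 1],
    [24, 610.0, 4],
    [21, 650.0, 2],
    [18, 700.0, 0],
    [27, 590.0, 5],
]
# Target: 1 = completed the degree, 0 = did not complete
y = [1, 1, 0, 1, 1, 0]

# Hold out part of the data to estimate generalization
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42)

# A small decision tree, one common choice for this kind of tabular prediction
clf = DecisionTreeClassifier(max_depth=3, random_state=42)
clf.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In a real study, a model like this would be trained on the institution's academic records and evaluated with cross-validation rather than a single small split.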






DOI: http://dx.doi.org/10.13037/ras.vol13n1.173


