IJCCI is a joint conference composed of three concurrent conferences: ICEC, ICFC and ICNC.
These three conferences are always co-located and held in parallel.
Keynote lectures are plenary sessions and can be attended by all IJCCI participants.
KEYNOTE SPEAKERS LIST
James Bezdek, University of Melbourne, Australia
Title: (Mostly) Fuzzy Clustering in Very Large Data Sets
Antonio Sala, Technical University Valencia, Spain
Title: Perspectives of Fuzzy Systems in Process Monitoring and Control
Simon M. Lucas, University of Essex, U.K.
Title: Information Theory and Game Strategy Learning
Panos Pardalos, University of Florida, U.S.A.
Title: Prior Knowledge in Supervised Classification Models
James Bezdek
University of Melbourne
James Bezdek received his PhD in Applied Mathematics from Cornell University in 1973. He is a past president of NAFIPS (North American Fuzzy Information Processing Society), IFSA (International Fuzzy Systems Association) and the IEEE CIS (Computational Intelligence Society); founding editor of the International Journal of Approximate Reasoning and the IEEE Transactions on Fuzzy Systems; a Life Fellow of the IEEE and of IFSA; and a recipient of the IEEE 3rd Millennium Medal, the IEEE CIS Fuzzy Systems Pioneer Award, and the IEEE Frank Rosenblatt Award (an IEEE technical field award).
This talk addresses the problem of clustering in very large (VL) data sets, defined as any data that cannot be loaded and processed on an available computer in a single pass. The overall objectives are acceleration for loadable data and feasibility for non-loadable data. The basic methodology is to progressively sample the VL data set until a termination criterion is reached. At this point, any basic algorithm that is efficiently extensible can be applied to the sample. Finally, the results of the sample run are extended non-iteratively to the rest of the data. Many iterative algorithms besides clustering can use this approach successfully, and it usually scales well as the size of the data set increases.
When the basic algorithm is clustering, the non-iterative extension phase is essentially a classifier, so this approach is closely related to (but not the same as) semi-supervised clustering in VL data. The remainder of this talk develops models for three kinds of data (image, feature vector, dissimilarity). The three schemes are illustrated with numerical examples using two clustering algorithms (fuzzy c-means and Gaussian mixture decomposition).
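The sample-then-extend scheme can be sketched as follows. This is a minimal illustration, not the algorithms from the talk: the plain fuzzy c-means loop, the fixed-fraction sampling "criterion", and the synthetic two-cluster data are all assumptions made for the example. The key point it shows is that the extension step (`memberships`) is non-iterative, so it can be streamed over the non-loadable remainder of the data in a single pass.

```python
import numpy as np

def memberships(X, V, m=2.0):
    """Fuzzy memberships of points X to centers V. Non-iterative:
    one computation per point, so it can be applied chunk by chunk
    to data that never fits in memory all at once."""
    d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1))
    return inv / inv.sum(axis=1, keepdims=True)

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means on the loaded sample (the iterative phase)."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), c, replace=False)]   # random initial centers
    for _ in range(iters):
        U = memberships(X, V, m)
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted mean update
    return V

# Hypothetical use: draw a sample (here a fixed 5% fraction stands in
# for the termination criterion), cluster it, then extend to all data.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 1, (5000, 2)), rng.normal(6, 1, (5000, 2))])
sample = data[rng.choice(len(data), 500, replace=False)]
centers = fcm(sample, c=2)
U_full = memberships(data, centers)   # single extension pass over all data
labels = U_full.argmax(axis=1)
```

For truly non-loadable data the final two lines would run over chunks read from disk; the centers fit in memory, so nothing iterative touches the full data set.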
Antonio Sala
Technical University Valencia
Prof. Antonio Sala received an M.Sc. degree in electrical engineering in 1993 and a Ph.D. degree in control systems in 1998, both from Valencia Technical University (UPV), Spain. Since 1993 he has been with the Systems and Control Engineering Department at UPV, where he is currently a full Professor and Department Vice-Head, teaching a wide range of subjects in areas such as linear systems theory, multivariable control and intelligent control; he has supervised four Ph.D. theses. He has coauthored 24 papers in middle- or top-impact journals and the book Multivariable Control Systems (Springer), and is co-editor of Iterative Identification and Control (Springer). He is a member of the IEEE and of IFAC (Spanish section), has served three years on the IFAC publications executive committee, and has acted as an evaluator for the Spanish accreditation board (ANECA). He is an Associate Editor of the IEEE Transactions on Fuzzy Systems. His current research interests include fuzzy control, networked control systems and process control applications.
Fuzzy control was initially introduced as a model-free control design method based on the knowledge of a human operator; however, current research is almost exclusively devoted to Lyapunov-based fuzzy control methods that can guarantee stability and robustness of the closed-loop system. This talk reviews techniques for identifying fuzzy models and designing model-based controllers. The role of some of these reasoning-free approaches in other tasks, such as system monitoring, is also reviewed.
Simon M. Lucas
University of Essex
Prof. Simon M. Lucas (SMIEEE) is a full professor of computer science at the University of Essex (UK), where he leads the Game Intelligence Group. His main research interests are games, evolutionary computation, and machine learning, and he has published widely in these fields with over 140 peer-reviewed papers, mostly in leading international conferences and journals. He was chair of IAPR Technical Committee 5 on Benchmarking and Software (2002-2006) and is the inventor of the scanning n-tuple classifier, a fast and accurate OCR method. He was appointed inaugural chair of the IEEE CIS Games Technical Committee in July 2006, has been competitions chair for many international conferences, and co-chaired the first IEEE Symposium on Computational Intelligence and Games in 2005. He was program chair for IEEE CEC 2006 and program co-chair for IEEE CIG 2007 and PPSN 2008. He is an associate editor of the IEEE Transactions on Evolutionary Computation and the Journal of Memetic Computing. He has given invited keynote talks and tutorials at many conferences, including IEEE CEC, IEEE CIG, and PPSN. Professor Lucas was recently appointed as the founding Editor-in-Chief of the IEEE Transactions on Computational Intelligence and AI in Games.
(more information can be found at: http://dces.essex.ac.uk/staff/lucas/)
Games have provided some of the most engaging challenges for AI since the birth of the field.
From an AI and machine learning perspective, the most interesting examples are those systems which can learn to play games at an expert level without being explicitly programmed to do so. The main algorithms for doing this are based on evolution (or co-evolution) and temporal difference learning.
In this talk I'll show how elementary information theory can be used to place upper bounds on the rate at which these algorithms can learn. This has important implications for what can realistically be learned from a given number of games played, given the setup of a particular algorithm.
Perhaps more importantly, this perspective also suggests ways to optimise existing algorithms and even design new ones that are able to learn more efficiently and provide a more informative view of the space of possible players. I will illustrate this with some examples.
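As a back-of-the-envelope illustration of this kind of bound (an assumption made for this example, not the talk's actual derivation): if each game against a fixed opponent ends in one of k outcomes, a single result conveys at most log2(k) bits, so singling out one of M candidate players from game results alone needs at least log2(M)/log2(k) games.

```python
import math

def min_games(num_candidates, outcomes_per_game=3):
    """Information-theoretic lower bound on the number of games needed
    to single out one of `num_candidates` players, when each game
    (e.g. win/draw/loss) reveals at most log2(outcomes_per_game) bits."""
    bits_needed = math.log2(num_candidates)
    bits_per_game = math.log2(outcomes_per_game)
    return math.ceil(bits_needed / bits_per_game)

# Selecting among 2**20 candidate weight settings from
# win/draw/loss results alone:
print(min_games(2**20))  # 20 bits / log2(3) bits per game -> 13 games
```

The same accounting caps the learning rate of any algorithm driven only by game outcomes, however the candidates are parameterised.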
Panos Pardalos
University of Florida
Dr. Panos Pardalos is Distinguished Professor of Industrial and Systems Engineering at the University of Florida. He is also affiliated faculty member of the Computer Science Department, the Hellenic Studies Center, and the Biomedical Engineering Department. He is also the director of the Center for Applied Optimization.
Dr. Pardalos obtained a PhD degree from the University of Minnesota in Computer and Information Sciences. He has held visiting appointments at Princeton University, the DIMACS Center, the Institute for Mathematics and its Applications, the Fields Institute, AT&T Labs Research, Trier University, Linköping Institute of Technology, and universities in Greece.
He has received numerous awards, including: University of Florida Research Foundation Professor; the UF Doctoral Dissertation Advisor/Mentoring Award; Foreign Member of the Royal Academy of Doctors (Spain), the Lithuanian Academy of Sciences, the Ukrainian Academy of Sciences, and the Petrovskaya Academy of Sciences and Arts (Russia); and Honorary Member of the Mongolian Academy of Sciences.
Dr. Pardalos received honorary doctorates from Lobachevsky University (Russia) and the V.M. Glushkov Institute of Cybernetics (Ukraine). He is a fellow of the AAAS and of INFORMS, and in 2001 he was awarded the Greek National Award and Gold Medal for Operations Research.
Dr. Pardalos is a world-leading expert in global and combinatorial optimization. He is the editor-in-chief of the Journal of Global Optimization, Optimization Letters, and Computational Management Science. In addition, he is the managing editor of several book series and a member of the editorial boards of several international journals. He is the author of 8 books and the editor of several others. He has written numerous articles and developed several well-known software packages. His research is supported by the National Science Foundation and other government organizations. His recent research interests include network design problems, optimization in telecommunications, e-commerce, data mining, biomedical applications, and massive computing.
Dr. Pardalos has been an invited lecturer at many universities and research institutes around the world. He has also organized several international conferences.
Classifiers built through supervised learning techniques are widely used in experimental sciences. Examples are neural networks, decision trees and support vector machines.
Recently, an extension of these classifiers has been proposed in which prior knowledge is formalized as a set of linear constraints. The resulting classifiers have lower complexity and roughly half the misclassification error of the original methods.
In this talk, we show how to extract knowledge from data to enhance classification models.
The overall method guarantees that the number of points in the training set is not increased and that the resulting model does not overfit the problem. Case studies are provided, based on biomedical, genomic and proteomic data sets taken from the literature.
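As a toy illustration of knowledge expressed as linear constraints (a sketch under assumptions made for this example, not the speakers' actual formulation): take a rule of the form "every feature vector in a box region belongs to class +1" and check it against a linear classifier w·x + b. Over an axis-aligned box the minimum of a linear function has a closed form, so the check is exact; general polyhedral knowledge regions would instead require a linear program.

```python
import numpy as np

def box_min(w, lo, hi):
    """Exact minimum of w . x over the box {lo <= x <= hi}:
    each coordinate independently picks lo when w_i > 0, else hi."""
    return float(np.where(w > 0, w * lo, w * hi).sum())

def satisfies_knowledge(w, b, lo, hi):
    """True iff every point of the box is classified positive,
    i.e. min over the box of (w . x + b) > 0."""
    return box_min(w, lo, hi) + b > 0

# Hypothetical rule "feature values in [2,3] x [1,4] are class +1",
# checked against an assumed hyperplane w . x + b = 0:
w = np.array([1.0, 0.5])
lo, hi = np.array([2.0, 1.0]), np.array([3.0, 4.0])
print(satisfies_knowledge(w, b=-2.0, lo=lo, hi=hi))  # min is 1*2 + 0.5*1 - 2 = 0.5 > 0 -> True
```

A constraint of this kind adds no training points: it restricts the admissible hyperplanes directly, which is consistent with the guarantee stated in the abstract.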
This is joint work with Mario Guarracino.