Our research topics include, among others:

**Nonsmooth and nonconvex analysis**

Classical differential theory can be generalized in a natural way to broader classes of functions via subdifferentials and subgradients. Our main interests in nonsmooth and nonconvex analysis relate to optimization theory. In particular, we are interested in various types of optimality conditions, both analytical and geometrical, for various types of optimization problems, both single-objective and multiobjective. The classical convexity assumptions can be weakened by different notions of generalized convexity while maintaining the sufficiency of the conditions. Thus, we study optimality conditions for different problem classes, for instance mixed-integer nonsmooth problems, generalized pseudo-convex nonsmooth problems, and both continuous and discrete multiobjective optimization problems.
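As a toy illustration of the subdifferential (not the group's actual machinery), for a piecewise linear convex function f(x) = max_i (a_i x + b_i) the subdifferential at any point can be read off from the active pieces; a minimal sketch:

```python
def subgradient(pieces, x):
    """Subdifferential of f(x) = max_i (a_i*x + b_i) at x.

    Any slope a_i of a piece attaining the maximum is a valid
    subgradient; at a kink the subdifferential is the whole interval
    between the smallest and largest active slopes.  Returns the
    endpoints of that interval.
    """
    vals = [a * x + b for a, b in pieces]
    fmax = max(vals)
    active = [a for (a, b), v in zip(pieces, vals) if v == fmax]
    return min(active), max(active)

# f(x) = |x| = max(x, -x): at 0 the subdifferential is [-1, 1],
# away from 0 it collapses to the ordinary derivative.
pieces = [(1.0, 0.0), (-1.0, 0.0)]
print(subgradient(pieces, 0.0))   # (-1.0, 1.0)
print(subgradient(pieces, 2.0))   # (1.0, 1.0)
```

At smooth points the interval degenerates to a single slope, recovering the classical derivative.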

**Nonsmooth optimization**

Nonsmooth optimization deals with the minimization or maximization of functions that lack the differentiability properties required by classical optimization theory. The field is significant not only because nondifferentiable functions arise directly in applications (e.g. in optimal control, engineering, mechanics, economics, data mining, machine learning and medical diagnosis), but also because several important methods for solving difficult smooth problems lead to nonsmooth subproblems that are either smaller in dimension or simpler in structure (e.g. unconstrained). Moreover, nonsmooth optimization techniques can be successfully applied to smooth problems, but not vice versa. Our main interests are bundle methods for nonsmooth optimization, large-scale nonsmooth optimization, constrained nonsmooth optimization and nonsmooth applications.
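Bundle methods are considerably more elaborate, but the basic idea of optimizing with subgradients instead of gradients can be sketched with the classical subgradient method on a hypothetical test function f(x) = |x - 3|:

```python
def subgradient_method(f, subgrad, x0, steps=200):
    """Subgradient method with a diminishing step size 1/k.

    Unlike gradient descent, a subgradient step need not decrease f,
    so we track the best point seen so far.
    """
    x, best = x0, x0
    for k in range(1, steps + 1):
        g = subgrad(x)
        if g == 0:              # 0 in the subdifferential: optimal
            return x
        x = x - (1.0 / k) * g
        if f(x) < f(best):
            best = x
    return best

# f(x) = |x - 3| is convex but not differentiable at its minimizer.
f = lambda x: abs(x - 3.0)
sg = lambda x: 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)
x = subgradient_method(f, sg, x0=0.0)   # converges towards x = 3
```

The diminishing step size is what guarantees convergence here; bundle methods accelerate this by accumulating several past subgradients into a model of the function.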

**Multiobjective optimization**

In multiobjective optimization there are several, possibly conflicting, criteria to be optimized instead of only one. This gives rise to entirely new challenges in both theory and methodology compared to classical single-objective optimization. Our goal is to respond to these challenges and to develop new theory and methods for multiobjective optimization. The theoretical issues we are especially interested in include, for example, optimality conditions and sensitivity analysis. In method development we concentrate mainly on interactive methods. We also welcome multiobjective applications arising from industry and other fields.
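With conflicting criteria there is typically no single best point but a set of trade-offs, the Pareto front. A minimal sketch, assuming minimization in every objective and a small hypothetical point set:

```python
def pareto_front(points):
    """Return the nondominated points (minimization in every objective).

    A point p is dominated if some other point q is at least as good
    in every objective and differs from p.
    """
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# (3, 3) is dominated by (2, 2); the rest are mutual trade-offs.
pts = [(1, 4), (2, 2), (3, 3), (4, 1)]
print(pareto_front(pts))   # [(1, 4), (2, 2), (4, 1)]
```

Interactive methods, the group's focus, can be seen as ways of steering the decision maker through exactly this kind of trade-off set.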

**Mixed integer nonlinear programming**

Mixed integer nonlinear programming (MINLP) refers to mathematical programming with both continuous and discrete variables and with nonlinearities in the objective function and constraints. MINLP is a natural approach for formulating problems in which the system structure (discrete) and the system parameters (continuous) must be optimized simultaneously. Research on MINLP has recently experienced tremendous growth. Nevertheless, the lack of numerically efficient solvers is still evident, and much effort is still required in this important research area. Notably, all current solution methods, as well as the theory they are based on, require the objective function and the constraints to be continuously differentiable, which is far too restrictive a condition for many real-life problems. Mixed integer nonsmooth optimization aims to remove this restriction. The more specific research areas of our team are cutting plane methods for MINLPs and nonsmooth branch-and-bound-type algorithms.
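The branch-and-bound idea behind such algorithms can be illustrated on a deliberately tiny hypothetical problem: minimize (x - 2.6)² over integers x in [0, 5]. The continuous relaxation gives a lower bound; fractional solutions are split into two subproblems, and subproblems whose bound cannot beat the incumbent are pruned:

```python
import math

def solve_relaxation(lo, hi):
    """Minimize (x - 2.6)**2 with continuous x in [lo, hi]."""
    x = min(max(2.6, lo), hi)
    return x, (x - 2.6) ** 2

def branch_and_bound(lo=0, hi=5):
    best_x, best_val = None, float("inf")
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        if a > b:
            continue                      # empty subproblem
        x, val = solve_relaxation(a, b)
        if val >= best_val:
            continue                      # prune: bound no better
        if abs(x - round(x)) < 1e-9:
            best_x, best_val = round(x), val   # integer-feasible
        else:                             # branch on the fractional x
            stack.append((a, math.floor(x)))
            stack.append((math.ceil(x), b))
    return best_x, best_val

best_x, best_val = branch_and_bound()     # integer optimum: x = 3
```

Real MINLP solvers replace the closed-form relaxation with a nonlinear (and, in the nonsmooth case, nondifferentiable) subproblem, but the enumeration-and-pruning skeleton is the same.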

**Optimization in molecular modeling**

Molecular modeling combines the theoretical foundations of chemistry (and physics) with computational techniques to model or mimic the behavior of molecules. Predicting the structure of molecules, identifying correlations between chemical structures, and designing molecules that interact in specific ways with other molecules (e.g. in drug design) are all good examples of tasks that benefit from molecular modeling. The optimization problems arising in molecular modeling are often both nonsmooth and highly nonconvex. The standard way to avoid these difficulties is to regularize the functions by some appropriate smoothing technique or simply to ignore the nonsmooth terms during the minimization procedure. Although such simplifications may facilitate the numerical solution of the problem, they may also distort the relationship between the model and the original system, so that the solution obtained may not be satisfactory for the original problem. Furthermore, the large dimensionality of many real-world problems sets its own requirements for the solution approaches. In our research we study new approaches and proper mathematical formulations for molecular modeling that do not require smoothing of the model.
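The distortion introduced by smoothing can be seen already on |x|, using one common smoothing of the absolute value (a generic illustration, not a model used by the group):

```python
import math

def abs_smooth(x, eps=1e-2):
    """A common smoothing of |x|: sqrt(x**2 + eps**2).

    Differentiable everywhere, but it perturbs the model: the smoothed
    function never reaches 0, so the value at the true minimizer is
    off by eps.
    """
    return math.sqrt(x * x + eps * eps)

print(abs(0.0))           # 0.0  (true model)
print(abs_smooth(0.0))    # ~0.01 (smoothed model overestimates)
```

In a molecular energy function made of many such terms, these small per-term errors can accumulate enough to move the computed minimizer away from the physically meaningful one, which is why the group pursues formulations that keep the nonsmooth terms intact.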

**Metaheuristics and hybrids for global optimization**

In real-life applications, functions of many variables usually also have a large number of local minima and maxima (i.e. they are nonconvex). In global optimization the goal is to find the global minimum or maximum, that is, the smallest of all local minima or the largest of all local maxima. New efficient global optimization methods are needed because realistic mathematical models are often very complex and, in addition to nonconvexity, involve nonsmoothness and a large number of variables. This is the case, for example, in molecular modeling. During the last decade the development of various types of metaheuristics, such as evolutionary algorithms, simulated annealing, tabu search and ant colony optimization, has been very popular. Although metaheuristics are robust and reliable, computational efficiency is not their strength. For this reason, hybridizing metaheuristics with efficient local optimization methods has proved to be a very promising way to develop global optimization methods that are both efficient and reliable.
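The hybridization idea can be sketched in a few lines: a metaheuristic (here simulated annealing, on a hypothetical multimodal test function) explores globally, and a cheap local method polishes its answer. This is a minimal illustration, not one of the group's hybrid methods:

```python
import math, random

def simulated_annealing(f, x0, temp=1.0, cooling=0.99, steps=2000):
    """Accept worse moves with probability exp(-delta/temp) to escape
    local minima; the temperature cools geometrically."""
    random.seed(0)                       # reproducible run
    x, best = x0, x0
    for _ in range(steps):
        cand = x + random.uniform(-0.5, 0.5)
        delta = f(cand) - f(x)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
        if f(x) < f(best):
            best = x
        temp *= cooling
    return best

def local_refine(f, x, h=1e-3, iters=1000):
    """Crude monotone local descent polishing the global phase."""
    for _ in range(iters):
        if f(x + h) < f(x):
            x += h
        elif f(x - h) < f(x):
            x -= h
    return x

# Multimodal test function with global minimum at x = 0.
f = lambda x: x * x + 2.0 * math.sin(5.0 * x) ** 2
x = local_refine(f, simulated_annealing(f, x0=4.0))
```

The division of labor is the point: the annealing phase is slow but hard to trap, while the local phase is fast but only reliable near a minimum.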

**Sensitivity and post-optimal analysis in multiobjective optimization**

In optimization, the question of stability arises when the set of feasible solutions and/or the choice function depend on parameters for which only the range of variation is known. The main focus in this area is on finding numerical measures of stability (known as stability functions and stability radii), as well as on formulating qualitative criteria that guarantee solution stability.
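For the simplest possible choice problem, picking the cheapest of finitely many alternatives, the stability radius has a closed form: under the Chebyshev (max) norm on the perturbations, the unique minimizer survives any perturbation smaller than half the gap to the second-best alternative. A minimal sketch of this textbook fact:

```python
def stability_radius(costs):
    """Chebyshev stability radius of the unique minimizer of a finite
    choice problem: the optimum is preserved under any cost
    perturbation with max-norm strictly smaller than
    (second best - best) / 2."""
    ordered = sorted(costs)
    return (ordered[1] - ordered[0]) / 2.0

# Best cost 3.0 can rise by r while second best 5.0 falls by r;
# they tie exactly at r = 1.0.
print(stability_radius([3.0, 5.0, 8.0]))   # 1.0
```

Realistic stability radii, e.g. for combinatorial problems where the alternatives are exponentially many, are much harder to compute, which is what makes them a research topic.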

**Robust optimization in discrete optimization and graph theory**

Robust optimization can be broadly defined as a process that aims to produce solutions that account for possible realizations of the input parameters, instead of producing the solution that is optimal only for a nominal situation, which rarely occurs. This is achieved by constructing new objective functions that play against the worst-case realization of the problem parameters under interval uncertainty. The priority directions in this topic are formulating robust counterpart models for various problems of discrete optimization and graph theory, as well as finding efficient algorithms to solve them.
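The min–max idea under interval uncertainty can be shown on a one-line toy problem (a generic illustration, not a specific model from this research): each alternative's cost is only known to lie in an interval, and the robust counterpart minimizes the worst case, i.e. the upper endpoint.

```python
def robust_choice(intervals):
    """Min-max selection: each alternative's cost lies in [lo, hi];
    the worst-case realization for minimization is hi, so the robust
    counterpart picks the alternative with the smallest hi."""
    return min(range(len(intervals)), key=lambda i: intervals[i][1])

# Alternative 0 looks better at the nominal midpoint (5 vs 4), but its
# worst case is 9; the robust choice is the safer alternative 1.
costs = [(1.0, 9.0), (3.0, 5.0)]
print(robust_choice(costs))   # 1
```

For graph problems (robust shortest paths, spanning trees, etc.) the same worst-case construction is applied edge by edge, and the difficulty is that the robust counterpart is often much harder than the nominal problem.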

**Various optimality principles and their parameterization**

While in single objective optimization the concept of an optimal solution is strictly defined and unique, in multiobjective optimization there is a wide variety of ways to define optimality, usually referred to as optimality principles. The most common optimality principles, such as Pareto, Nash, lexicographic and majority, are traditional but may not fully cover all of the decision maker's preferences. Sometimes a parameterized version of an optimality principle may reflect the desired preferences better. Our focus is on analyzing various properties of optimal solutions under different optimality principles.
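How much the chosen principle matters can be seen on a three-point toy example: under the lexicographic principle the objectives are ranked by priority, so ties in the first objective are broken by the second, and a point that is Pareto optimal may still be rejected.

```python
def lexicographic_optimum(points):
    """Lexicographic optimality: minimize the first objective, break
    ties with the second, and so on (Python's tuple ordering is
    exactly this comparison)."""
    return min(points)

# (2, 0) is Pareto optimal (best in the second objective), yet the
# lexicographic principle discards it in favor of (1, 2).
pts = [(1, 4), (1, 2), (2, 0)]
print(lexicographic_optimum(pts))   # (1, 2)
```

A parameterized principle might instead, for instance, tolerate a bounded loss in the first objective before the second one is consulted, which is the kind of variation this topic studies.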