Seminar
Place: Room N1019, 12-1pm.
Fall 2014 Schedule
 Olympia Hadjiliadis (Brooklyn College and Graduate Center)
Joint with Math Club and Math Seminar Series
Title: Trends and Trades
 Delaram Kahrobaei (City Tech and CUNY Graduate Center)
Date: October 14, 2014
Title: Recent Developments on Using Group Theory in Information Security
Spring 2014 Schedule
 Enric Ventura (Universitat Politècnica de Catalunya (UPC), Barcelona, Spain)
Date: March 13, 2014
Title: Using graphs to understand free groups
Fall 2012 Schedule
 Delaram Kahrobaei (City Tech, GC, CUNY)
Date: October 9, 2012
Title: Research and Educational Funding Opportunities in Algebra, Logic and Computation
 Vladimir Shpilrain (City College, GC, CUNY)
Date: November 13, 2012
Title: Noncommutative Cryptography
 Corina Calinescu (City Tech, CLAC, CUNY)
Date: December 4, 2012.
Title: On Applications of Vertex Operator Algebras in Lie Theory and Partition Theory
 David Finston (Brooklyn College, CUNY)
Date: December 11, 2012.
Title: Exotic Affine Spheres and the Cancellation Problem
Spring 2012 Schedule
 Ben Steinberg (City College, CUNY)
Date: April 3, 2012
 Victoria Gitman (City Tech, CUNY)
Date: February 14, 2012.
Fall 2011 Schedule
 Danny Kopec (Brooklyn College, CUNY)
Date: October 25, 2011
Fall 2010 Schedule
 Enric Ventura (Barcelona, Spain)
Date: November 23, 2010
Title: Most Groups Are Hyperbolic ... Or Most Groups Are Trivial?
Spring 2010 Schedule
 Sam Coskey (CUNY):
Date: March 9, 2010 Title:
Abstract:
 Victoria Gitman:
Date: March 16, 2010 Title: Nonstandard Models of the Peano Axioms: Why not expect an iPhone app for them?
Abstract: The 16th-19th centuries saw some of the greatest advances in Number Theory with the work of Euler, Gauss, Fermat, and others. But unlike Euclid in the Elements, these mathematicians did not work under the constraints of a formal axiomatic system. It was not until the late 19th century that a fundamental shift in thinking marked a return to the formal mathematics of Euclid from two millennia earlier. In 1890, Giuseppe Peano proposed a formal axiomatization of number theory, the Peano Axioms, consisting of the fundamental properties of addition and multiplication together with induction. Together with most mathematicians of his time, Peano believed the natural numbers to be the unique structure satisfying the Peano Axioms and thought that every number-theoretic property was provable from them. They turned out to be wrong on both counts. In the early 20th century, it was shown that there are nonstandard models of the Peano Axioms, that is, structures satisfying the Peano Axioms but not isomorphic to the natural numbers. In the 1930s came the final blow, as Gödel showed in his famous Incompleteness Theorem that the attempt to axiomatize the natural numbers was futile: any reasonable axiomatization leaves some assertion unprovable. It followed that there are nonstandard models of the Peano Axioms that are not just non-isomorphic to the natural numbers but have fundamentally different properties. In this talk, I will give a brief introduction to nonstandard models of the Peano Axioms and prove a famous 1959 result of Stanley Tennenbaum, which showed that there is no algorithmic construction of such a model.
Fall 2009 Schedule
 Laura Ghezzi:
Date: November 3, 2009 Title: Minimal free resolutions in Commutative Algebra, and applications to Computational Algebra.
Abstract: Broadly speaking, commutative algebra is the study of the solution set of polynomial equations in many variables, conveniently "encoded" in a commutative ring (the coordinate ring of the corresponding algebraic variety). We are interested in properties of these rings, such as dimension, depth, and Cohen-Macaulayness.
A particularly convenient tool in this study is given by free resolutions, that is, the process of approximating a (complicated) ring with free modules in a finite number of steps. The maps between the free modules at each step are given by matrices. The advantage is that free resolutions are easily computable with software such as CoCoA and Macaulay.
In the first part of this talk I will give an overview of free resolutions. In the second part, to provide a more specific framework, I will briefly present my recent research on the minimal free resolution of points in the projective space.
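As a small worked example (my own illustration, not taken from the abstract): over R = k[x, y], the ideal (x, y) has the Koszul complex as its minimal free resolution, with the maps between the free modules given by matrices:

```latex
0 \longrightarrow R
  \xrightarrow{\;\begin{pmatrix} -y \\ \phantom{-}x \end{pmatrix}\;} R^{2}
  \xrightarrow{\;\begin{pmatrix} x & y \end{pmatrix}\;} R
  \longrightarrow R/(x,y) \longrightarrow 0
```

The composite of the two matrix maps is $(-y)\,x + x\,y = 0$, so this is a complex, and every matrix entry lies in the maximal ideal $(x,y)$, which is what makes the resolution minimal.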
 Delaram Kahrobaei
Date: November 17, 2009 Title: Graphic Arithmetic.
 G. Rosenberger (Technische Universität Dortmund):
Date: December 8, 2009
Title: On Asymptotic Densities and Generic Properties in Finitely Generated Groups Doubles.
Abstract: The asymptotic density is a method to compute densities and/or probabilities within infinite finitely generated groups. If P is a group property, the asymptotic density determines the measure of the set of elements which satisfy P. If the asymptotic density equals 1, we say that the property P is generic in G. P is called an asymptotically visible property if the corresponding asymptotic density is strictly between 0 and 1. If the asymptotic density is 0, then P is called negligible. We show that there is an interesting connection between the asymptotic properties of a group G and the asymptotic properties of its subgroups of finite index. Using this, we give some applications for several classes of groups.
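In the standard formulation (notation mine, assuming G is generated by a finite set S, with B_n the ball of radius n in the word metric), the asymptotic density of a property P is

```latex
\rho(P) \;=\; \lim_{n \to \infty}
  \frac{\bigl|\{\, g \in B_n : g \text{ satisfies } P \,\}\bigr|}{\lvert B_n \rvert}
```

whenever the limit exists; P is generic when $\rho(P) = 1$, negligible when $\rho(P) = 0$, and asymptotically visible when $0 < \rho(P) < 1$.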
Spring 2009 Schedule
 Victoria Gitman
New York City College of Technology, City University of New York, March 17, 2009
Title: Gödel's Proof: The First Incompleteness Theorem Part 1.
 Thomas Johnstone
New York City College of Technology, City University of New York, March 24, 2009
Title: Gödel's Proof: The First Incompleteness Theorem Part 2.
Abstract for the March 17 and March 24 seminars: In 1931, Kurt Gödel showed that for any reasonable axiomatization of number theory, such as the Peano Axioms, there are always statements that are true but not provable. This monumental result, known as the First Incompleteness Theorem, was the first in a series of discoveries asserting the fundamental limitations of formal mathematics. Together with the Second Incompleteness Theorem, Gödel's work ended Hilbert's Program of formalizing all known mathematics. In this two-part talk, we will introduce the historical context of Hilbert's Program and prove Gödel's result.
 Thomas Tradler
New York City College of Technology, City University of New York, April 21, 2009
Title: Quantum Conundrums
Abstract: Since the invention of quantum mechanics over 100 years ago, physicists have been aware of the interpretational shortcomings of this highly successful theory. These problems have been exhibited most prominently in the statistical interpretation of experiments such as the double-slit experiment, and of what is known as Schrödinger's cat. Despite this highly unsatisfactory state of affairs and much theoretical progress, which led to fields such as quantum field theory, string theory, and noncommutative geometry, the statistical interpretation of quantum mechanics is still fundamental to its understanding. In my talk, I will review some of the main experiments that display the seeming contradictions of the theory, and describe some of the (algebraic) attempts to obtain a more satisfactory interpretation. The hope is that one day a theory will emerge which will resolve these issues by providing a more reasonable setting than the one currently used.
 Sereta Scott (Computer Systems Technology, New York City College of Technology)
Supervised by: Dr. Delaram Kahrobaei, April 28, 2009
Title: ElGamal's and Schnorr's Digital Signatures
Abstract: In this paper, we present the topic of digital signatures, showing how the Schnorr signature scheme, a variant of ElGamal's scheme, is used to enhance the security of smart card technology. We show how a cryptographic algorithm is used for signature generation and signature verification. Cryptography is about the prevention and detection of cheating and other malicious activities in data security. ElGamal's and Schnorr's digital signature schemes are widely used because of the difficulty of breaking the Diffie-Hellman key exchange, which involves the discrete logarithm problem. Schnorr's digital signature scheme is based on the same principle as ElGamal's, except that Schnorr's method first signs the message and then applies the hash function, while ElGamal's does the reverse. Schnorr's scheme also minimizes the time needed to generate signatures.
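As an illustrative sketch only (not the smart-card implementation from the talk), a textbook Schnorr-style sign/verify cycle can be written in a few lines. The parameters p, q, g below are toy values of my own choosing, far too small for any real security:

```python
import hashlib
import secrets

# Toy group parameters (NOT secure): p = 2q + 1 with p, q prime,
# and g = 4 generates the subgroup of prime order q in Z_p^*.
p, q, g = 2039, 1019, 4

def H(r: int, m: bytes) -> int:
    """Hash the commitment r together with the message, reduced mod q."""
    data = r.to_bytes(4, "big") + m
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1   # private key in [1, q-1]
    y = pow(g, x, p)                   # public key y = g^x mod p
    return x, y

def sign(x: int, m: bytes):
    k = secrets.randbelow(q - 1) + 1   # fresh per-signature nonce
    r = pow(g, k, p)                   # commitment
    e = H(r, m)                        # challenge
    s = (k + x * e) % q                # response
    return e, s

def verify(y: int, m: bytes, sig) -> bool:
    e, s = sig
    # Recompute the commitment: g^s * y^(-e) = g^(k + xe) * g^(-xe) = g^k = r
    r = (pow(g, s, p) * pow(y, -e, p)) % p
    return H(r, m) == e
```

A signature here is the pair (e, s); verification succeeds because g^s y^(-e) reconstructs the original commitment r, so hashing it with the message reproduces e.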
Fall 2008 Schedule
 Introduction to CLAC
Dr. Delaram Kahrobaei, New York City College of Technology, September 9
 Dr. Xiandong Li
New York City College of Technology, City University of New York, September 16
Title: Quantum Information
Abstract: For decades the remarkable facts of quantum physics have alternately frustrated and delighted the best thinking of physicists from Einstein to Feynman. Until recently, physicists found surprisingly few real-world applications. Quantum information is a very important application of quantum mechanics. This talk will introduce the audience to the basic knowledge needed to go from quantum mechanics to quantum information.
 Dr. Sean Zhang
College of Staten Island, City University of New York, October 7
Title: RFID Technology and RFID Authentication Protocols
Abstract: Radio Frequency Identification (RFID) is an exciting emerging technology. Its use has expanded extremely widely and quickly, and it is increasingly ubiquitous around the globe: from large department stores like Walmart, pharmaceuticals, aviation, military goods logistics, manufacturer supply chains, libraries, hospitals, and kindergartens, to livestock, banknotes, passports, and the latest NYC Marathon runner tracker. Along with the massive deployment of RFID systems, various security issues and privacy concerns have been raised. In this talk, we will first explain what RFID technology is and how it works. We will then introduce some mutual authentication protocols that tackle these security issues and privacy concerns.
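To give a flavor of mutual authentication (a generic HMAC challenge-response toy of my own construction, not any specific protocol from the talk), a reader and a tag sharing a secret key can each prove knowledge of the key over a pair of fresh nonces:

```python
import hashlib
import hmac
import secrets

def mac(key: bytes, *parts: bytes) -> bytes:
    """Keyed MAC over labeled, delimited message parts."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

class Tag:
    def __init__(self, key: bytes):
        self.key = key
    def respond(self, reader_nonce: bytes):
        # Tag picks its own nonce and proves key knowledge over both nonces.
        self.nonce = secrets.token_bytes(16)
        return self.nonce, mac(self.key, b"tag", reader_nonce, self.nonce)
    def check_reader(self, reader_nonce: bytes, proof: bytes) -> bool:
        # Tag verifies the reader's proof -- the "mutual" half.
        expected = mac(self.key, b"reader", reader_nonce, self.nonce)
        return hmac.compare_digest(proof, expected)

class Reader:
    def __init__(self, key: bytes):
        self.key = key
    def challenge(self) -> bytes:
        self.nonce = secrets.token_bytes(16)
        return self.nonce
    def check_tag(self, tag_nonce: bytes, tag_proof: bytes) -> bool:
        expected = mac(self.key, b"tag", self.nonce, tag_nonce)
        if not hmac.compare_digest(tag_proof, expected):
            return False
        self.tag_nonce = tag_nonce
        return True
    def prove(self) -> bytes:
        # Reader's answering proof, bound to both nonces.
        return mac(self.key, b"reader", self.nonce, self.tag_nonce)
```

Fresh nonces on both sides prevent simple replay of an overheard exchange; a tag holding the wrong key produces a proof the reader rejects.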
 Digital Signatures
Ms. Sereta Scott (City Tech) and Mr. Franklin Fung (Polytechnic Institute of NYU), October 28
Abstract: The notion of a digital signature may prove to be one of the most fundamental and useful inventions of modern cryptography. A signature scheme provides a way for each user to sign messages so that the signatures can later be verified by anyone else. More specifically, each user can create a matched pair of private and public keys: only the signer can produce a signature for a message (using the private key), while anyone can verify it (using the signer's public key). The verifier can convince himself that the message contents have not been altered since the message was signed. Also, the signer cannot later repudiate having signed the message, since no one but the signer possesses his private key. By analogy with the paper world, where one might sign a letter and seal it in an envelope, one can sign an electronic message using one's private key, and then seal the result by encrypting it with the recipient's public key. The recipient can perform the inverse operations of opening the letter and verifying the signature. Such applications to electronic mail are already quite widespread today.
 Models of Theoretical Computing
Ms. Elisa Elshamy New York City College of Technology December 2
Title: How to think about algorithms in theory and practice
Abstract: In the early 20th century, the famous mathematician David Hilbert asked whether all mathematical problems could be solved by a mechanical procedure. Hilbert believed that the answer to his question would be affirmative, but this turned out to be very far from the truth. In order to tackle Hilbert's question, it was necessary to formalize what was meant by a mechanical procedure. In the 1930s, several mathematicians took up this challenge. Gödel came up with the recursive functions. Alan Turing, the father of modern computers, came up with Turing machines. In later years, following the development of actual computers, new formalizations such as unlimited register machines were introduced. There is no single formal definition of what an algorithm is. In this talk, I will give an informal intuition for the notion of an algorithm and discuss Turing machines and unlimited register machines as two formalizations of this concept. The remarkable fact that came out of the study of theoretical and practical computation is that diverse models of computation, ranging from Gödel's recursive functions to Turing machines to modern computers, all have equal computational capabilities: in disguise, they all have exactly the same algorithms!
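The Turing machine model mentioned above can be made concrete in a few lines. This is a minimal simulator of my own devising (the state names and encoding are illustrative conventions, not from the talk), together with a sample machine that increments a binary number:

```python
# A minimal deterministic Turing machine simulator.
# transitions: (state, symbol) -> (new_symbol, move, new_state),
# where move is -1 (left), 0 (stay), or +1 (right).

def run_tm(transitions, tape_str, start, halt, blank="_"):
    tape = dict(enumerate(tape_str))   # sparse tape: cell index -> symbol
    head, state = 0, start
    while state != halt:
        symbol = tape.get(head, blank)
        new_symbol, move, state = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += move
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Binary increment: scan right to the end of the input,
# then propagate the carry leftward (1 -> 0 until a 0 or blank becomes 1).
INCREMENT = {
    ("scan",  "0"): ("0", +1, "scan"),
    ("scan",  "1"): ("1", +1, "scan"),
    ("scan",  "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1",  0, "halt"),
    ("carry", "_"): ("1",  0, "halt"),
}
```

For example, run_tm(INCREMENT, "1011", "scan", "halt") returns "1100", i.e. 11 + 1 = 12 in binary; the same simulator runs any machine given as a transition table, which is the sense in which the model is universal over its programs.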

