Ofir Gorodetsky (Tel-Aviv University)

Variance of sums of two squares in short intervals, and fractional divisor functions

Landau's theorem tells us that the density of positive integers up to X which are sums of two squares is asymptotically proportional to 1/sqrt(ln(X)). A much harder problem is to understand the distribution of the number of sums of two squares in a short interval [x, x+Delta(x)], where x is chosen at random from [1, X], Delta grows with x, and X tends to infinity. Recent works of Smilansky and of Freiberg, Kurlberg, and Rosenzweig showed, under unproven conjectures, that the moments of this count, suitably normalized, converge to those of a Poisson-distributed variable. They treated only very short intervals, where Delta grows like sqrt(ln(x)).

I will explain how one can obtain unconditional results in the analogous setting of the polynomial ring over a large finite field, using monodromy calculations of Katz and an approach pioneered by Keating and Rudnick. Our methods give the variance of the distribution and allow Delta(x) to grow like x^epsilon for any epsilon in (0,1). This is joint work with Brad Rodgers.
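As a numerical illustration of Landau's theorem (my own sketch, not part of the talk): the script below counts sums of two squares up to X using Fermat's two-squares criterion (n is a sum of two squares iff every prime p ≡ 3 (mod 4) divides n to an even power) and compares the count with K*X/sqrt(ln X), where K ≈ 0.7642 is the Landau-Ramanujan constant.

```python
import math

def is_sum_of_two_squares(n):
    # Fermat/Euler criterion: n > 0 is a sum of two squares iff every
    # prime p with p % 4 == 3 divides n to an even power.
    d, m = 2, n
    while d * d <= m:
        if m % d == 0:
            e = 0
            while m % d == 0:
                m //= d
                e += 1
            if d % 4 == 3 and e % 2 == 1:
                return False
        d += 1
    if m > 1 and m % 4 == 3:  # leftover prime factor
        return False
    return True

X = 50000
count = sum(1 for n in range(1, X + 1) if is_sum_of_two_squares(n))
K = 0.76422365  # Landau-Ramanujan constant
prediction = K * X / math.sqrt(math.log(X))
print(count, round(prediction))
```

The ratio count/prediction exceeds 1 at this range because the asymptotic has lower-order correction terms of size 1/ln(X); convergence is slow.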
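The Poisson prediction in the very short regime can be probed empirically (a simulation sketch of my own, not the talk's method): sieve the indicator of sums of two squares up to X, sample random short intervals of length Delta ~ sqrt(ln X), and compare the empirical mean and variance of the counts; a Poisson limit would make them asymptotically equal.

```python
import math
import random

X = 10**6
Delta = round(math.sqrt(math.log(X)))  # the "very short" regime, Delta ~ sqrt(ln x)

# Sieve: mark every a^2 + b^2 up to X + Delta.
N = X + Delta
is_s2s = bytearray(N + 1)
a = 0
while a * a <= N:
    b = a
    while a * a + b * b <= N:
        is_s2s[a * a + b * b] = 1
        b += 1
    a += 1

# Count sums of two squares in [x, x+Delta) for random x in [1, X].
random.seed(0)
samples = [sum(is_s2s[x:x + Delta])
           for x in (random.randrange(1, X) for _ in range(4000))]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean={mean:.3f}, variance={var:.3f}")  # Poisson: mean ≈ variance
```

This only probes the conjectural behavior at a finite range; the talk's point is that the analogous statements become unconditional theorems over function fields.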