EFFECTIVE SCHNIRELMANN’S METHOD FOR O-REGULAR SEQUENCES

CHRISTIAN TÁFULA

Abstract. In this paper we introduce the notion of pre-basis, which is a sequence some h-fold sumset of which has positive lower asymptotic density, the least such h being its pre-order. Schnirelmann’s classical theory of sumsets is reinterpreted as characterizing additive bases in terms of pre-bases, and estimates for the order in terms of the pre-order are derived from the deep theorems of Mann and Kneser. Under certain regularity assumptions on the representation functions, we then derive estimates for the pre-order. This is achieved by studying sequences A = {a_0 < a_1 < a_2 < ...} ⊆ N for which A(2x) = O(A(x)) and a_{2n} = O(a_n), and it ends up providing a small shortcut to the proofs of the Schnirelmann–Goldbach theorem and Linnik’s elementary solution of Waring’s problem.

1. Introduction

Given an infinite sequence A = {a_0 < a_1 < a_2 < ...} ⊆ N and an integer h ≥ 2, the h-fold sumset hA is the set {k_1 + ... + k_h : k_1, ..., k_h ∈ A}. We say that A is an additive basis (resp. additive asymptotic basis) when there is h ≥ 2 such that N \ hA is empty (resp. finite), and the least such h is called the order (resp. asymptotic order) of A, denoted by O(A) (resp. O*(A)). The representation functions r_{A,h}(n) and s_{A,h}(x) count the number of solutions of k_1 + k_2 + ... + k_h = n and k_1 + k_2 + ... + k_h ≤ x, resp., where h ≥ 1 is a fixed integer and k_i ∈ A, considering permutations, in the sense of the formal series:

(1.1)   (Σ_{a∈A} z^a)^h = Σ_{n≥0} r_{A,h}(n) z^n,    (1/(1 − z)) (Σ_{a∈A} z^a)^h = Σ_{n≥0} s_{A,h}(n) z^n.

Certain regularity properties of representation functions are essential to the elementary arguments showing that the primes (Schnirelmann–Goldbach theorem, cf. Chapter 7 of Nathanson [4]) and sequences generated by polynomials (generalized Waring’s problem, cf. Chapters 11 & 12 of Nathanson [6]) are additive bases. In this paper we investigate some of these regularity properties in a general setting. Denoting by A(x) the quantity |A ∩ [0, x]|, we study three types of sequences: the O-regular sequences (OR), which satisfy A(2x) = O(A(x)); the positively increasing sequences (PI), which satisfy a_{2n} = O(a_n); and the
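The series identity (1.1) can be checked by direct computation. The following sketch (illustrative, not part of the paper; all names are ours) obtains r_{A,h} as the coefficients of the h-th power of the generating polynomial of a truncated sequence, and s_{A,h} as its partial sums, using the squares as an example.

```python
# Illustrative sketch (not from the paper): r_{A,h} as the coefficients of
# (sum_{a in A} z^a)^h, and s_{A,h} as the partial sums of r_{A,h}.

def rep_counts(A, h, N):
    """Return [r_{A,h}(0), ..., r_{A,h}(N)] by repeated polynomial multiplication."""
    base = [0] * (N + 1)
    for a in A:
        if a <= N:
            base[a] += 1
    r = [1] + [0] * N                       # h = 0: the empty sum represents 0 once
    for _ in range(h):
        nxt = [0] * (N + 1)
        for i, c in enumerate(r):
            if c:
                for j in range(N + 1 - i):
                    if base[j]:
                        nxt[i + j] += c * base[j]
        r = nxt
    return r

squares = [n * n for n in range(6)]         # 0, 1, 4, 9, 16, 25
r2 = rep_counts(squares, 2, 25)             # r2[25] = 4: 0+25, 25+0, 9+16, 16+9
s2 = [sum(r2[: x + 1]) for x in range(26)]  # s_{A,2}(x) = sum_{n <= x} r_{A,2}(n)
```

Note that permutations are counted, in line with (1.1): the four ordered representations of 25 above collapse to two unordered ones.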

2010 Mathematics Subject Classification. Primary 11B13, 11B34.
Key words and phrases. additive bases, representation functions, O-regular variation, Schnirelmann’s theorem, Schnirelmann’s method.

O-regular plus sequences (OR+), which are sequences that are both OR and PI. The first part of this work is dedicated to motivating the study of these sequences through the asymptotic behavior of their representation functions.

In the second part we introduce the notion of additive pre-bases. Given a sequence A ⊆ N, consider the following densities:

  Schnirelmann density: σ(A) := inf_{n≥1} (A(n) − A(0))/n;
  Asymptotic density: d(A) := lim_{x→+∞} A(x)/x;
  Power density: w(A) := lim_{x→+∞} log(A(x))/log(x);

the last two having lower (lim inf) and upper (lim sup) forms, signaled by a lower or upper bar, resp. (e.g. w̲, d̲). We define the Schnirelmann pre-order and the pre-order of A as, respectively,

  Õ_σ(A) := min{h ≥ 1 : σ(hA) > 0},    Õ(A) := min{h ≥ 1 : d̲(hA) > 0}.

Thus pre-bases are sequences A with finite pre-order. The motivation for this concept comes from Schnirelmann’s characterization of additive bases (see Theorem 3.1), which basically states:

  • B ⊆ N is a basis ⇐⇒ Õ_σ(B) < +∞ and {0, 1} ⊆ B;
  • A ⊆ N is an asymptotic basis ⇐⇒ Õ(A) < +∞ and A is not contained in an arithmetic progression with common difference r ≥ 2.

Our main result is a combination of effective bounds for the order of a sequence in terms of its pre-order with a regularity condition on representation functions that ensures a sequence is a pre-basis. This last ingredient relates to what is loosely referred to in the literature as Schnirelmann’s method. In order to state it, say a pre-basis A is stable when every A′ ⊆ A with A′(x) = Θ(A(x)) is also a pre-basis. Moreover, say it is uniformly stable when there is h ≥ 1 such that Õ(A′) ≤ h for every such A′ ⊆ A.
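For a concrete feel of these densities, the following sketch (illustrative; function names are ours) computes finite-range proxies for σ and d̲ on h-fold sumsets of the squares. Two squares have asymptotic density 0 while three already have positive density (Legendre's three-square theorem), so the pre-order of the squares is 3; by Lagrange's four-square theorem, the finite proxy for σ(4A) equals 1.

```python
# Illustrative sketch (not from the paper): finite-range proxies for the
# densities of h-fold sumsets of the squares.

def h_fold_sumset(A, h, N):
    """hA ∩ [0, N]; since 0 is in A, hA also contains kA for k <= h."""
    S = {0}
    for _ in range(h):
        S = {s + a for s in S for a in A if s + a <= N}
    return S

def sigma_proxy(S, N):
    """min_{1 <= n <= N} |S ∩ [1, n]| / n, a finite stand-in for sigma(S)."""
    count, best = 0, 1.0
    for n in range(1, N + 1):
        count += n in S
        best = min(best, count / n)
    return best

N = 2000
squares = [n * n for n in range(45)]   # 44^2 = 1936 <= N
two = h_fold_sumset(squares, 2, N)
three = h_fold_sumset(squares, 3, N)
four = h_fold_sumset(squares, 4, N)
# len(two)/N is small, len(three)/N is near 5/6, and sigma_proxy(four, N)
# equals 1 because every n is a sum of four squares.
```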

Main Theorem (Effective Schnirelmann’s method). Let A ⊆ N be an OR+ sequence. If for some h ≥ 1 we have

  r_{A,h}(n) ≪ (A(n)^h / n) ξ(n),  where  Σ_{n≤x} ξ(n)² = O(x),

then A is a uniformly stable pre-basis, with Õ(A′) ≤ h for every subsequence A′ ⊆ A with A′(x) = Θ(A(x)). Furthermore:

  • If {0, 1} ⊆ A, then the order of A is at most ⌈σ(hA)^{−1}⌉ h;
  • If A is not contained in an arithmetic progression, then the asymptotic order of A is at most ⌈d̲(hA)^{−1} + (1/h) · (1 − d̲(hA))/d̲(hA)⌉ h + ⌊d̲(hA)⌋ h.

This statement is the combination of Corollary 3.4 with Theorem 3.1, and it constitutes a small shortcut to the proof of the Schnirelmann–Goldbach theorem and the elementary solution of the generalized Waring’s problem, covering their relatively easy parts.

Notation. A real function f is asymptotically defined when f is well-defined on [α, +∞) for some α ≥ 0. Whenever an “α” appears unspecified in the text, this is the meaning implicitly intended. Our use of asymptotic notation (Θ, ≍, O, ≪, o, ∼) is standard, with the addition of “≺”, which has the same meaning as small-o (i.e. f ≺ g ⇐⇒ f = o(g))¹.

2. O-regularity in sequences

We start with some generalities. Given a sequence A ⊆ N, let n ∈ N and let h ≥ 2 be an integer. From the formal series (1.1) one can immediately deduce the following recursive formulas, valid for 1 ≤ ℓ ≤ h − 1:

(2.1)   r_{A,h}(n) = Σ_{k≤n} r_{A,h−ℓ}(k) r_{A,ℓ}(n − k),
        s_{A,h}(n) = Σ_{k≤n} r_{A,h−ℓ}(k) s_{A,ℓ}(n − k) = Σ_{k≤n} s_{A,h−ℓ}(k) r_{A,ℓ}(n − k).

For purposes of induction the case ℓ = 1 is usually enough, and we denote r_{A,1}(n) simply by 1_A(n). The following proposition motivates our study of O-regularity in sequences.
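The recursion (2.1) is easy to check numerically. The sketch below (illustrative, not from the paper) computes r_{A,h} by repeated convolution with the indicator of A, and compares r_{A,4} against the convolution of r_{A,3} and r_{A,1}.

```python
# Quick numeric check (illustrative) of the recursion (2.1) with l = 1:
# r_{A,h}(n) = sum_{k <= n} r_{A,h-1}(k) * 1_A(n - k).

def rep(A, h, N):
    """[r_{A,h}(0), ..., r_{A,h}(N)] via repeated convolution with 1_A."""
    ind = [1 if n in A else 0 for n in range(N + 1)]
    out = [1] + [0] * N                   # h = 0
    for _ in range(h):
        out = [sum(out[k] * ind[n - k] for k in range(n + 1))
               for n in range(N + 1)]
    return out

A = {0, 1, 3, 7, 12, 20}                  # an arbitrary small sequence
N = 40
r1, r3, r4 = rep(A, 1, N), rep(A, 3, N), rep(A, 4, N)
conv = [sum(r3[k] * r1[n - k] for k in range(n + 1)) for n in range(N + 1)]
# conv == r4: the l = 1 case of (2.1); the general l follows by associativity
```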

Proposition 2.1. For every sequence A ⊆ N,

  s_{A,h+1}(x) ≻ s_{A,h}(x),  ∀h ≥ 1.

Proof. Choose some 0 < ε < 1 and define:

  L_1(x) := A(x)^{1−ε};
  L_{h+1}(x) := A(L_h(x))^{1−ε},  ∀h ≥ 1.

We will use induction to show that, for all h ≥ 1,

(2.2)   s_{A,h}(x) ≪ s_{A,h+1}(x) / A(L_h(x)).

As L_{h+1}(x) ≺ L_h(x) and L_h(x) ≻ 1 for all h, this implies the statement of our proposition. Let us start with h = 1. Given that

  s_{A,1}(x) − s_{A,1}(x − L_1(x)) = Σ_{x−L_1(x) < n ≤ x} 1_A(n) ≤ L_1(x) = A(x)^{1−ε} ≺ s_{A,1}(x),

we have s_{A,1}(x) ∼ s_{A,1}(x − L_1(x)). Hence:

  s_{A,2}(n) = Σ_{k≤n} s_{A,1}(n − k) 1_A(k)
             ≥ Σ_{k≤L_1(n)} s_{A,1}(n − k) 1_A(k)
             ≫ s_{A,1}(n − L_1(n)) · A(L_1(n))
             ≫ s_{A,1}(n) · A(L_1(n)),

thus (2.2) holds for h = 1.

¹This is one of G. H. Hardy’s asymptotic symbols. Despite being a little old-fashioned, we think it provides a nice counterpart to small-o, in the same way “≪” does for big-O.

For the induction step, note that for every h ≥ 1 we have

(2.3)   r_{A,h+1}(n) = Σ_{k≤n} r_{A,h}(k) 1_A(n − k) ≤ s_{A,h}(n).

Taking h > 1 and assuming (2.2) valid for h − 1, we have, by (2.3),

  s_{A,h}(x) − s_{A,h}(x − L_h(x)) = Σ_{x−L_h(x) < n ≤ x} r_{A,h}(n)
    ≤ Σ_{x−L_h(x) < n ≤ x} s_{A,h−1}(n)
    ≪ L_h(x) s_{A,h−1}(x)
    ≪ L_h(x) s_{A,h}(x)/A(L_{h−1}(x))
    ≺ s_{A,h}(x),

then it follows that s_{A,h}(x) ∼ s_{A,h}(x − L_h(x)). Hence:

  s_{A,h+1}(n) = Σ_{k≤n} s_{A,h}(n − k) 1_A(k)
               ≥ Σ_{k≤L_h(n)} s_{A,h}(n − k) 1_A(k)
               ≫ s_{A,h}(n − L_h(n)) · A(L_h(n))
               ≫ s_{A,h}(n) · A(L_h(n)),

thus (2.2) holds for all h.  ∎

This is a quite general estimate, and with the right restraints on the growth of A(x) we can actually say substantially more.

2.1. OR sequences. Let us introduce some bits of regular variation theory; an extensive treatment of this topic can be found in Bingham, Goldie & Teugels [1]. We will only need the theory from Chapter 2. Take f an asymptotically defined positive real function. We say f is

  • Slowly varying if f(λx) ∼ f(x) for all λ > 0;
  • Regularly varying if f(λx) ∼ λ^ρ f(x), for all λ > 0 and some ρ ∈ R;
  • O-regularly varying if f(λx) ≍ f(x) for all λ > 0.

The generality of these definitions lies in Karamata’s characterization theorem², which states that if f(λx) ∼ g(λ)f(x) with g(λ) ∈ (0, +∞) for all λ > 0 and f is measurable, then f is regularly varying, i.e. there is ρ ∈ R such that g(λ) ≡ λ^ρ. One can then promptly see how O-regular variation extends regular variation, and that is why we shall focus only on the former.

Recall that a sequence A ⊆ N is OR when A(2x) = O(A(x)). This is equivalent to A being an O-regularly varying function. In fact, even more is true:

Proposition 2.2 (OR sequences). Let A ⊆ N be a sequence. The following are equivalent:

(i) A is O-regularly varying;

²Theorem 1.4.1, p. 17 of Bingham et al. [1].

(ii) A(2x) = O(A(x));
(iii) For all h ≥ 1 it holds that s_{A,h}(x) = Θ(A(x)^h);
(iv) For at least one h ≥ 1 it holds that s_{A,h}(2x) = O(s_{A,h}(x)).

Proof. (iii) ⇒ (iv) is obvious. To show (i) ⇒ (iii), we show that, in general, for every sequence A ⊆ N it holds that

(2.4)   A(x/2^h)^h ≤ s_{A,h}(x) ≤ A(x)^h.

The case h = 1 is clear, hence we proceed by induction. By the recursive formulas in (2.1),

  s_{A,h}(n) = Σ_{k≤n} s_{A,h−1}(k) 1_A(n − k)
             ≤ s_{A,h−1}(n) Σ_{k≤n} 1_A(n − k)
             = s_{A,h−1}(n) · A(n) ≤ A(n)^h,

and

  s_{A,h}(n) ≥ Σ_{n/2 ≤ k ≤ n} s_{A,h−1}(k) 1_A(n − k)
             ≥ s_{A,h−1}(n/2) Σ_{n/2 ≤ k ≤ n} 1_A(n − k)
             = s_{A,h−1}(n/2) · A(n/2) ≥ A(n/2^h)^{h−1} · A(n/2) ≥ A(n/2^h)^h,

so (2.4) follows. Since for O-regularly varying A we have A(x/2^h) ≍ A(x), item (iii) follows from (2.4).

We now show (iv) ⇒ (ii). The case h = 1 is trivial, so suppose h ≥ 2. Iterating (iv) gives s_{A,h}(2^k x) = O(s_{A,h}(x)) for every fixed k ≥ 1. From (2.4) it follows that

  A(x)^h ≤ s_{A,h}(2^h x) ≪ s_{A,h}(x) ≤ A(x)^h,

therefore s_{A,h}(x) ≍ A(x)^h. Hence

  A(2x)^h ≍ s_{A,h}(2x) ≪ s_{A,h}(x) ≍ A(x)^h,

so A(2x) ≪ A(x), which is item (ii). Finally, (ii) ⇒ (i): for λ > 1, choosing k with 2^k ≥ λ gives A(λx) ≤ A(2^k x) ≪ A(x); for 0 < λ < 1, the same bound applied at λx gives A(x) ≪ A(λx), while A(λx) ≤ A(x) trivially. Hence A(λx) ≍ A(x) for all λ > 0.  ∎

Having in mind that (2.4) holds in general, one can then say that OR sequences are the most well-behaved sequences in terms of growth order, for it is sufficient to know A(x) to deduce the growth of s_{A,h}.
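The elementary bounds in the proof above can be sanity-checked by brute force. The sketch below (illustrative, not from the paper) verifies, for a small arbitrary sequence, the chain that the induction produces: A(x/2^h)^h ≤ Π_{j≤h} A(x/2^j) ≤ s_{A,h}(x) ≤ A(x)^h.

```python
# Brute-force check (illustrative) of the elementary bounds behind (2.4).
from itertools import product

A = [0, 1, 2, 4, 8, 16, 32]               # an arbitrary small sequence
h, x = 3, 32

s = sum(1 for t in product(A, repeat=h) if sum(t) <= x)   # s_{A,h}(x)

def count(y):                             # A(y) = |A ∩ [0, y]|
    return sum(1 for a in A if a <= y)

lower = 1
for j in range(1, h + 1):
    lower *= count(x / 2 ** j)            # prod_{j<=h} A(x/2^j)

assert count(x / 2 ** h) ** h <= lower <= s <= count(x) ** h
```

Here the lower bound is far from sharp (120 against s = 206 and the upper bound 343), which is consistent with (2.4) being an order-of-growth tool rather than a counting identity.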

2.2. PI sequences. In spite of that, assuming A to be OR alone does not guarantee that a_{⌊x⌋} is O-regularly varying. For this part we need another bit of regular variation theory. Still following Bingham et al. [1], say an asymptotically defined positive real function f is almost increasing when there is m > 0 such that

  f(y) ≥ m f(x),  ∀y ≥ x ≥ α.

This is equivalent to saying that f(x) ≍ inf_{y≥x} f(y); almost decreasing functions can be defined in an analogous way, namely f(x) ≍ sup_{y≥x} f(y). Consider then the upper (M*) and lower (M_*) Matuszewska indices of f:

  M*(f)  := inf{γ ∈ R : x^{−γ} f(x) is almost decreasing},
  M_*(f) := sup{γ ∈ R : x^{−γ} f(x) is almost increasing}.

This definition is given in view of the almost-monotonicity theorem³. These indices are preserved under the asymptotic sign “≍”, and both are finite if and only if f is O-regularly varying⁴. Functions with M_*(f) > 0 are said to have positive increase. Our motivation to study these indices comes from the next two lemmas, the first of which we call by a fancy name.

The OR–PI lemma (Corollary 2.6.2, p. 96 of Bingham et al. [1]). Let f be an asymptotically defined, positive and locally integrable real function. The following are equivalent:

(i) ∫_α^x (f(t)/t) dt = Θ(f(x));
(ii) 0 < M_*(f) ≤ M*(f) < +∞;
(iii) f is O-regularly varying and has positive increase.

In our study, since A(x) is always an increasing function, we always take for granted that M_*(A) ≥ 0. However, a_{2n} = O(a_n) if and only if A has positive increase. This fact follows from the next lemma.

Lemma 2.3. For any given sequence A ⊆ N,

  M_*(A) = 1/M*(a_{⌊·⌋}),    M*(A) = 1/M_*(a_{⌊·⌋});

adopting the conventions “1/0 = +∞” and “1/(+∞) = 0”.

Proof. We will only show that M_*(A) = 1/M*(a_{⌊·⌋}). The argument for the other equation is entirely analogous, but with opposite inequalities. First, suppose M_*(A) > 0. Taking 0 < γ < M_*(A), there exist m > 0 and some x_0 ∈ R_+ for which

  A(y)/y^γ ≥ m A(x)/x^γ,  ∀y ≥ x ≥ x_0.

³Theorem 2.2.2, p. 72 of Bingham et al. [1].
⁴Theorem 2.1.7, p. 71 of Bingham et al. [1].

Hence, there must be some n_0 ∈ N such that

  A(a_N)/a_N^γ ≥ m A(a_n)/a_n^γ,  ∀N ≥ n ≥ n_0;

but since A(a_k) = k + 1,

  a_N/(N + 1)^{1/γ} ≤ (1/m^{1/γ}) · a_n/(n + 1)^{1/γ},  ∀N ≥ n ≥ n_0;

which means a_n/n^{1/γ} is almost decreasing, thus M*(a_{⌊·⌋}) ≤ 1/γ, and letting γ → M_*(A) yields M*(a_{⌊·⌋}) ≤ 1/M_*(A).

To see that strict inequality cannot hold, we now take γ > M_*(A). Note that now we are considering the possibility of having M_*(A) = 0, for the next argument will also apply to the “0, +∞” case. For this chosen γ, the function A(x)/x^γ will not be almost increasing; that is, ∀ε > 0, ∀M ∈ R_+ there are y_ε > x_ε ≥ M such that

  A(y_ε)/y_ε^γ ≤ ε A(x_ε)/x_ε^γ.

Letting n_ε := A(x_ε) and N_ε := A(y_ε) + 1, the above inequality implies

  a_{N_ε}/(N_ε − 1)^{1/γ} ≥ (1/ε^{1/γ}) · a_{n_ε}/n_ε^{1/γ},

which means a_n/n^{1/γ} cannot be almost decreasing, thus M*(a_{⌊·⌋}) ≥ 1/γ, and letting γ → M_*(A) yields M*(a_{⌊·⌋}) ≥ 1/M_*(A). Hence equality must hold when M_*(A) > 0, and M_*(A) vanishes if and only if M*(a_{⌊·⌋}) = +∞.  ∎

Almost all interesting bases are both OR and PI, such as the sequence of primes and sequences generated by polynomials, which even have regularly varying counting functions. Nonetheless, not all bases are interesting! For instance, the sequence

  C := [0, 2^{2^{17}}] ∪ ⋃_{k≥17} ( {2^{2^k} + t(k − 1) : 0 ≤ t ≤ 2^{2^k}} ∪ [k·2^{2^k}, 2^{2^{k+1}}] )

is an example of a 2-basis, i.e. a basis of order 2, which is not PI. It is a relatively simple exercise to show that 2C = N; to see that it is not PI, let n_k be such that c_{n_k} = 2^{2^k}. One can then show that n_k ∼ 2^{2^k} as k → +∞, and thus c_{2n_k}/c_{n_k} = k − o(k), implying c_{2n} ≠ O(c_n), which by the last lemma yields M_*(C) = 0. On the other hand, the sequence

  D := P_3 ∪ { (2^{2^k})^3 + t : 0 ≤ t ≤ (2^{2^k})^2, k ∈ N },

where P_3 is the sequence of cubes, is not OR. It is a basis, for it contains P_3, which is well known to be a 9-basis⁵; to see that it is not OR, let n_k := (2^{2^k})^3. It is not hard to show that D(n_k) ≺ n_k^{2/3} as k → +∞, and then

  D(2n_k)/D(n_k) ≥ D(n_k + n_k^{2/3})/D(n_k) = 1 + n_k^{2/3}/D(n_k) ≻ 1,

implying D(2x) ≠ O(D(x)), which yields M*(D) = +∞. That being said, we finish this section by characterizing OR+ sequences, which we recall are sequences that are both OR and PI.

⁵Wieferich–Kempner theorem (cf. Chapter 2 of Nathanson [4]).

Proposition 2.4 (OR+ sequences). Let A ⊆ N be a sequence. The following are equivalent:

(i) A is O-regularly varying and has positive increase;
(ii) Σ_{n≤x} a_n/n = Θ(a_{⌊x⌋});
(iii) ∫_1^x (A(t)/t) dt = Θ(A(x));
(iv) Both A(2x) = O(A(x)) and a_{2n} = O(a_n);
(v) For at least one h ≥ 1 it holds that ∫_1^x (s_{A,h}(t)/t) dt = Θ(s_{A,h}(x)).

Proof. In view of Proposition 2.2, the equivalence of the first three items follows directly from the OR–PI lemma and Lemma 2.3. It is also clear that (i) ⇒ (iv) and (iv) ⇒ (v). It only remains for us to show that (v) ⇒ (i). By the OR–PI lemma, s_{A,h} is O-regularly varying and M_*(s_{A,h}) > 0. From O-regularity, items (iii) and (iv) of Proposition 2.2 tell us that A(x) ≍ s_{A,h}(x)^{1/h}. Since “≍” preserves Matuszewska indices, we have M_*(A) = M_*(s_{A,h})/h > 0, hence A has positive increase.  ∎

3. Schnirelmann and pre-bases

3.1. Effective Schnirelmann’s theorem. We start the second part of this paper by discussing the work of Schnirelmann. The concept of Schnirelmann density (σ) was first introduced by L. G. Schnirelmann in the 1930s, who showed, as part of his proof that the sequence of primes forms a basis, that every sequence with positive Schnirelmann density forms a basis⁶. Although some other results are called by this name, this is the one we shall refer to as Schnirelmann’s theorem. Whereas this density works just fine when it comes to bases, it does not make much sense for asymptotic bases; its asymptotic counterpart is the lower asymptotic density (d̲), which is arguably more natural to consider. The celebrated theorems of Mann and Kneser provide lower bounds for the density of sumsets.

Mann’s theorem (Theorem 3, p. 5 of Halberstam & Roth [3]). Given h + 1 sequences B_0, ..., B_h ⊆ N, if 0 ∈ ⋂_{j≤h} B_j then

  σ(B_0 + ... + B_h) ≥ min{1, σ(B_0) + ... + σ(B_h)}.
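Mann's inequality can be observed on a finite range. The sketch below (illustrative, not from the paper; the sequences are ours) takes h = 1 and two sequences built from congruence classes together with {0, 1}, and checks the inequality with finite Schnirelmann quotients.

```python
# Finite-range illustration of Mann's inequality with h = 1:
# sigma(B0 + B1) >= min{1, sigma(B0) + sigma(B1)}.

N = 500
B0 = {0, 1} | {n for n in range(N + 1) if n % 3 == 0}   # sigma close to 1/3
B1 = {0, 1} | {n for n in range(N + 1) if n % 5 == 0}   # sigma close to 1/5

S = {a + b for a in B0 for b in B1 if a + b <= N}       # (B0 + B1) ∩ [0, N]

def sigma(C, N):
    """min_{1 <= n <= N} |C ∩ [1, n]| / n, the finite Schnirelmann quotient."""
    count, best = 0, 1.0
    for n in range(1, N + 1):
        count += n in C
        best = min(best, count / n)
    return best

assert sigma(S, N) >= min(1.0, sigma(B0, N) + sigma(B1, N))
```

In this example the sumset actually covers everything (3a + 5b represents every n ≥ 8, and {0, 1} fills the small cases), so the inequality holds with room to spare.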

⁶Theorem 4, p. 8 of Halberstam & Roth [3].

Kneser’s theorem (Theorem 3.8, p. 985 of Pomerance & Sárközy [8]⁷). Given h + 1 sequences A_0, ..., A_h ⊆ N, either

  d̲(A_0 + ... + A_h) ≥ liminf_{x→+∞} (A_0(x) + ... + A_h(x))/x,

or there are numbers m, r_0, ..., r_h ∈ N such that

(i) A_j ⊆ C_j for each j ≤ h, where C_j is the union of r_j distinct congruence classes modulo m;
(ii) (C_0 + ... + C_h) \ (A_0 + ... + A_h) is finite;
(iii) d̲(A_0 + ... + A_h) ≥ (r_0 + ... + r_h − h)/m.

These bounds are known to be best possible (cf. Nathanson [5]), in the sense that for every h ≥ 2 there are A_j, B_j ⊆ N and λ_j ∈ [0, 1] for 1 ≤ j ≤ h with σ(B_j) = d̲(A_j) = λ_j such that equality holds. From these bounds one can derive an effective form of the previously mentioned theorem of Schnirelmann. Recall our definitions of the Schnirelmann pre-order (Õ_σ) and pre-order (Õ) given in the introduction, and also note that the power densities can be written in the alternate forms

  w̄(A) = inf{γ ≥ 0 : A(x) ≪ x^γ},    w̲(A) = sup{γ ≥ 0 : A(x) ≫ x^γ}.

Given m ≥ 2, a sequence A ⊆ N is m-degenerate if A is entirely contained in some congruence class mod m, and nondegenerate otherwise. Although no novel methods are employed in the next proof, the following version of this result is not present in the literature.

Theorem 3.1 (Effective Schnirelmann’s theorem).

(i) A sequence B ⊆ N is a basis if and only if {0, 1} ⊆ B and Õ_σ(B) is finite. Furthermore, if h ≥ Õ_σ(B) then

(3.1)   ⌈1/w̄(B)⌉ ≤ O(B) ≤ ⌈1/σ(hB)⌉ h;

(ii) A sequence A ⊆ N is an asymptotic basis if and only if A is nondegenerate and Õ(A) is finite. Furthermore, if h ≥ Õ(A) then

(3.2)   ⌈1/w̄(A)⌉ ≤ O*(A) ≤ ⌈1/d̲(hA) + (1/h) · (1 − d̲(hA))/d̲(hA)⌉ h + ⌊d̲(hA)⌋ h.

Proof. The lower bounds in both items follow from the elementary fact that s_{A,h}(x) ≤ A(x)^h. Indeed, if h < 1/w̄(A), then it follows that s_{A,h}(x) ≺ x, thus a fortiori d̲(hA) = 0. The rest is proved separately.

• Item (i): If B ⊆ N is a basis, then it must contain {0, 1} to ensure that hB contains the first h integers; also, for at least some h ≥ 1 we must have σ(hB) positive, e.g. for h = O(B), thus Õ_σ(B) is finite.

⁷A full proof is given in Chapter I of Halberstam & Roth [3].

The converse is exactly Schnirelmann’s original theorem, but Mann’s theorem provides a better upper bound. Take h ≥ Õ_σ(B) and let B_h := hB. Since σ(B_h) > 0, Mann’s theorem implies that, taking g ≥ 1/σ(B_h),

  σ(g B_h) ≥ min{1, g σ(B_h)} = 1,

therefore ghB = N. Hence there must be some ℓ ≤ ⌈1/σ(hB)⌉ h satisfying ℓB = N.

• Item (ii): If A ⊆ N is an asymptotic basis, then A must be nondegenerate, since otherwise every hA would be contained in some arithmetic progression; also, d̲(hA) must be positive for some h ≥ 1, e.g. for h = O*(A), thus Õ(A) is finite.

One could deduce the converse using only Schnirelmann’s original theorem, but Kneser’s theorem allows us to obtain better bounds. Our proof is inspired by the argument due to Nathanson and Sárközy in Theorem 3.9, p. 985 of Pomerance & Sárközy [8]. Let h ≥ Õ(A) and A_h := hA. Taking g > 1/d̲(A_h), the sequence gA_h must fall into the second case of Kneser’s theorem; using the same notation as in its statement, let r := r_j for all 1 ≤ j ≤ g, which are equal because we are adding the same sequence g times. By items (i) and (iii) from the theorem applied to A_h,

(3.3)   1 ≥ d̲(g A_h) ≥ (g r − (g − 1))/m ≥ g d̲(A_h) − (g − 1)/m,

therefore m ≤ (g − 1)/(g d̲(A_h) − 1). In the worst case scenario, i.e. when N \ A_h is not finite, we must have m ≥ 2. More than that: since by nondegeneracy there must be at least 2 coprime residues in A (mod m), we must have m > h, and thus, since r = |hA (mod m)|, it follows that

(3.4)   r ≥ h + 1.

From (3.3) and (3.4) we deduce, using the upper bound for m,

(3.5)   d̲(g A_h) ≥ (g r − (g − 1))/m ≥ (g h + 1)/m ≥ (g h + 1) (g d̲(A_h) − 1)/(g − 1).

We can now work out how large g must be for N \ gA_h to be finite. If

  g ≥ (h + 1 − d̲(A_h))/(h d̲(A_h)),

then, reorganizing the expression in (3.5),

  d̲(g A_h) ≥ (g h + 1)(g d̲(A_h) − 1)/(g − 1)
           = (g (g h d̲(A_h) + d̲(A_h) − h) − 1)/(g − 1)
           ≥ (g − 1)/(g − 1) = 1.

From item (ii) of Kneser’s theorem, gA_h contains all but finitely many nonnegative integers. This means that the same must be true for some g less than or equal to

  ⌈(h + 1 − d̲(A_h))/(h d̲(A_h))⌉ = ⌈1/d̲(hA) + (1/h) · (1 − d̲(hA))/d̲(hA)⌉.

Notice, however, that we also assumed g > 1/d̲(A_h). The above expression is always strictly greater than 1/d̲(A_h), except when d̲(A_h) = 1, in which case equality holds. Nonetheless, in that case we can at least guarantee that N \ 2A_h is finite, hence (3.2) follows.  ∎

The lower bounds are best possible in the aforementioned sense. This follows straight from the existence of thin bases (cf. Nathanson [7]). The upper bounds seem more elusive, though.

To rephrase what Theorem 3.1 is telling us, let gcd(A) denote the minimum of gcd(a, b) with a, b ∈ A, and define A − k := {n ∈ N : n + k ∈ A}. With this, consider the normalization of A:

  A* := { n/gcd(A − a_0) : n ∈ A − a_0 }.

If A is an asymptotic basis then so is A − k for all k ≤ a_0; furthermore, A* is an asymptotic h-basis if and only if hA contains an infinite arithmetic progression of common difference gcd(A − a_0) whereas (h − 1)A does not. Note that normalized sequences are nondegenerate.

Since d̲(A*) = gcd(A − a_0) · d̲(A) and Õ(A) = Õ(A ∪ {0, 1}), the previous theorem reduces the study of bases to that of pre-bases. Note, however, that it has no regard whatsoever for the exact value of the order or asymptotic order of a given sequence. The sense in which the reduction happens is that, since pre-bases are sequences with finite pre-order, a sequence is a basis (resp. asymptotic basis) if and only if it is a pre-basis that contains {0, 1} (resp. is nondegenerate). Moreover, all bases and asymptotic bases are pre-bases, and if A is a pre-basis, then A ∪ {0, 1} is a basis and A* is an asymptotic basis.

3.2. Effective Schnirelmann’s method. In spite of all that, in general, the hard work in showing that a sequence is an additive basis lies precisely in showing that it is a pre-basis. The proof of the Schnirelmann–Goldbach theorem (cf. Chapter 7 of Nathanson [4]) achieves this by using elementary sieve theory, whereas Linnik’s elementary solution of Waring’s problem (cf. Chapter 2 of Gel’fond & Linnik [2]) uses instead elementary estimates on the number of solutions to certain linear Diophantine equations. The common theme of these two proofs is what is called Schnirelmann’s method. It is hard to pinpoint an exact procedure common to both proofs other than Schnirelmann’s theorem, but both rely heavily on a specific regularity of the representation functions involved. The next result describes a raw formulation of this regularity, and we call it the vanilla Schnirelmann’s method for its virtual generality.

Proposition 3.2 (Vanilla Schnirelmann’s method). For any given sequence A ⊆ N, if for some h ≥ 1 we have

  Σ_{n≤x} r_{A,h}(n)² ≪ s_{A,h}(x)²/x,

then A is a pre-basis with Õ(A) ≤ h.

Proof. By the Cauchy–Schwarz inequality,

  (Σ_{n≤x} r_{A,h}(n))² ≤ |hA ∩ [0, x]| · Σ_{n≤x} r_{A,h}(n)².

Thus, by our hypothesis,

  |hA ∩ [0, x]| ≥ (Σ_{n≤x} r_{A,h}(n))² / Σ_{n≤x} r_{A,h}(n)² ≫ x s_{A,h}(x)²/s_{A,h}(x)² = x,

which implies d̲(hA) > 0.  ∎

In principle, this is a very strong assumption. Note that given nonnegative integers x_1, ..., x_n, we have

  (Σ_{k=1}^n x_k)²/n ≤ Σ_{k=1}^n x_k² ≤ (Σ_{k=1}^n x_k)²,

with equality holding on the LHS iff x_1 = x_2 = ... = x_n. One can then interpret the estimate from the proposition as asking for r_{A,h} to have no “peaks”. Corollary 3.4 formalizes this intuition when it comes to OR+ sequences. We derive it from the following.
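The Cauchy–Schwarz step of Proposition 3.2 can be watched in action. The sketch below (illustrative, not from the paper) takes h = 2 and A the primes, so that r_{A,2} is the (ordered) Goldbach count, and checks the resulting lower bound on the size of the sumset's support.

```python
# Numeric illustration of the Cauchy-Schwarz step in Proposition 3.2:
# |hA ∩ [0, x]| >= (sum_{n<=x} r)^2 / sum_{n<=x} r^2, here with h = 2.

N = 300
sieve = [False, False] + [True] * (N - 1)
for p in range(2, int(N ** 0.5) + 1):
    if sieve[p]:
        for m in range(p * p, N + 1, p):
            sieve[m] = False
ind = [1 if sieve[n] else 0 for n in range(N + 1)]   # 1_A for A = primes

r2 = [sum(ind[k] * ind[n - k] for k in range(n + 1)) for n in range(N + 1)]

support = sum(1 for v in r2 if v > 0)     # |2A ∩ [0, N]|
s = sum(r2)                               # s_{A,2}(N)
ss = sum(v * v for v in r2)               # sum of r_{A,2}(n)^2
assert support >= s * s / ss              # the Cauchy-Schwarz lower bound
```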

Theorem 3.3. If A ⊆ N is an OR+ sequence, then

  Σ_{α<n≤x} n · (r_{A,h}(n)/s_{A,h}(n)) = Θ(x),  ∀h ≥ 1.

Proof. As A is, in particular, an OR sequence, we have s_{A,h}(x) ≍ A(x)^h, therefore M_*(s_{A,h}) = h M_*(A), which is nonzero by hypothesis. From positive increase one can then deduce that, for all sufficiently large λ,

(3.6)   Σ_{n≤x} r_{A,h}(n) ≪ Σ_{x<n≤λx} r_{A,h}(n),

that is, s_{A,h}(x) ≪ s_{A,h}(λx) − s_{A,h}(x). Moreover, by O-regularity, s_{A,h}(n) ≍ s_{A,h}(x) for x < n ≤ λx, so that

  (1/s_{A,h}(x)) Σ_{x<n≤λx} n · r_{A,h}(n) ≍ Σ_{x<n≤λx} n · (r_{A,h}(n)/s_{A,h}(n)).

Since n ≍ x in this range, the left-hand side is ≍ (x/s_{A,h}(x)) (s_{A,h}(λx) − s_{A,h}(x)), which by (3.6) and O-regularity is ≍ x. Hence, for every such λ,

(3.7)   Σ_{x<n≤λx} n · (r_{A,h}(n)/s_{A,h}(n)) ≍ x,

with implicit constants depending only on λ. Let λ ≥ 2 be an integer for which (3.6) holds, and split (α, x] into the blocks (x/λ^{j+1}, x/λ^j] with x/λ^{j+1} ≥ α, plus a leftover piece contained in (α, λα]. Applying (3.7) to each block,

  Σ_{α<n≤x} n · (r_{A,h}(n)/s_{A,h}(n)) ≍ Σ_{j≥0} x/λ^{j+1} ≍ x,

for the first block alone contributes ≫ x, the blocks are dominated by a geometric series, and the leftover piece contributes O(1), since r_{A,h}(n) ≤ s_{A,h}(n) pointwise. This proves the statement.  ∎

This theorem shows how good s_{A,h}(n)/n is as an approximation of r_{A,h}(n) when A is OR+. It can be rephrased in the following form: for all h ≥ 1 there is an arithmetic function ξ_h such that

  r_{A,h}(n) ≪ (A(n)^h/n) ξ_h(n),  where  Σ_{n≤x} ξ_h(n) = O(x);

i.e. ξ_h(n) is bounded on average. It is not hard to show that, in general, w̲(A) > 0 implies that Σ_{n≤x} ξ_h(n) is at most O(x log(x)) for all h. For some specific cases it is possible to obtain even better estimates, such as in the case of the primes, for example, where ξ_2(n) can be taken to be Π_{p|n}(1 + p^{−1})⁸. This fact is at the heart of the proof of the Schnirelmann–Goldbach theorem.

Following Section 11.2 of Nathanson [6], let us first state the general notion of stability for additive bases. A sequence A ⊆ N is said to be a stable basis (resp. stable asymptotic basis) when A is a basis (resp. asymptotic basis) and every subsequence A′ ⊆ A with {0, 1} ⊆ A′ (resp. A′ nondegenerate) and A′(x) = Θ(A(x)) is also a basis (resp. asymptotic basis). Recall that stable pre-bases are defined similarly, without the additional assumption of containing {0, 1} or being nondegenerate. Moreover, this allows a notion of uniformity: if Õ(A′) ≤ h for all suitable A′ ⊆ A, say A is a uniformly stable pre-basis w.r.t. (with respect to) h.
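The boundedness in square mean of ξ_2(n) = Π_{p|n}(1 + 1/p), the property required by the Main Theorem, is easy to observe on a finite range. The following sketch (illustrative; it is not a proof, and the cutoff is arbitrary) factors each n by trial division and averages ξ_2(n)².

```python
# Finite-range illustration that xi_2(n) = prod_{p|n} (1 + 1/p) is bounded
# in square mean, consistent with sum_{n<=x} xi_2(n)^2 = O(x).

def xi2(n):
    val, p = 1.0, 2
    while p * p <= n:
        if n % p == 0:
            val *= 1 + 1 / p
            while n % p == 0:
                n //= p
        p += 1
    if n > 1:                             # leftover prime factor
        val *= 1 + 1 / n
    return val

x = 20_000
avg = sum(xi2(n) ** 2 for n in range(1, x + 1)) / x
# avg stays bounded by a small constant, consistent with O(x)
```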

Corollary 3.4 (Effective Schnirelmann’s method). Let A ⊆ N be an OR+ sequence. If for some h ≥ 1 we have

  r_{A,h}(n) ≪ (A(n)^h/n) ξ(n),  where  Σ_{n≤x} ξ(n)² = O(x),

then A is a uniformly stable pre-basis w.r.t. h.

Proof. As A is, in particular, an OR sequence, s_{A,h}(x) ≍ A(x)^h. Our hypothesis is then equivalent to

  Σ_{α<n≤x} (n · r_{A,h}(n)/s_{A,h}(n))² ≪ x.

By the Cauchy–Schwarz inequality and Theorem 3.3,

  x² ≍ (Σ_{α<n≤x} n · r_{A,h}(n)/s_{A,h}(n))² ≤ |hA ∩ (α, x]| · Σ_{α<n≤x} (n · r_{A,h}(n)/s_{A,h}(n))² ≪ |hA ∩ (α, x]| · x,

hence |hA ∩ (α, x]| ≫ x, which implies d̲(hA) > 0, and consequently Õ(A) ≤ h. Furthermore, for any subsequence A′ ⊆ A with A′(x) = Θ(A(x)),

  r_{A′,h}(n) ≤ r_{A,h}(n) ≪ (A(n)^h/n) ξ(n) ≍ (A′(n)^h/n) ξ(n),

which leads us to conclude, by the same reasoning, that Õ(A′) ≤ h.  ∎

Remark. By a very similar argument, one can show that OR sequences satisfying the hypothesis of Proposition 3.2 will actually be uniformly stable pre-bases w.r.t. h.

Remark. One should not be bothered by the fact that our representation functions count permutations, for this does not affect their practical interest when studying bases. In fact, define

  r′_{A,h}(n) := #{ {a_1, a_2, ..., a_h} ⊆ A : Σ_{j=1}^h a_j = n }

and s′_{A,h}(x) := Σ_{n≤x} r′_{A,h}(n). It is simple to see that r′_{A,h}(n) = Θ(r_{A,h}(n)), for we have r_{A,h}(n)/h! ≤ r′_{A,h}(n) ≤ r_{A,h}(n). The same, consequently, applies to s and s′, therefore results about growth order can be easily translated from one definition to the other.

⁸Theorem 7.2, p. 186 of Nathanson [4].

References

1. N. H. Bingham, C. M. Goldie & J. L. Teugels, Regular Variation, Cambridge Univ. Press, 1989.
2. A. O. Gel’fond & Yu. V. Linnik, Elementary Methods in the Analytic Theory of Numbers, translated by D. E. Brown, Oxford: Pergamon Press, 1966.
3. H. Halberstam & K. F. Roth, Sequences, Revised ed., Springer, 1983.
4. M. B. Nathanson, Additive Number Theory: The Classical Bases, Graduate Texts in Mathematics 164, Springer, 1996.
5. M. B. Nathanson, Best possible results on the density of sumsets, in: Analytic Number Theory. Proceedings of a Conference in Honor of Paul T. Bateman (Illinois, 1989), B. C. Berndt et al. (eds.), Boston: Birkhäuser, 1990, 395–403.
6. M. B. Nathanson, Elementary Methods in Number Theory, Graduate Texts in Mathematics 195, Springer, 1999.
7. M. B. Nathanson, Thin bases in additive number theory, Discrete Mathematics 312 (2012), 2069–2075.
8. C. Pomerance & A. Sárközy, Combinatorial Number Theory, in: Handbook of Combinatorics, R. L. Graham, M. Grötschel & L. Lovász (eds.), MIT Press, 1995, 976–1018.

Research Institute for Mathematical Sciences, Kyoto University, 606-8502 Kyoto, Japan
E-mail address: [email protected]