It Begs The Question
Hey…I'm just saying… And while we're at it, why are you defending them?

An Actual Math Mystery November 12, 2017

The Mystery Formula


On Nov. 1 I found the following sheet of notebook paper with some formulas written on it. 

It was tucked away in a remote corner of a bookshelf.


Here it is transcribed with some annotation and easier to read:


ln(Cp) + Sp = Sn

next  convert  Cp         see

ln( n / ln(n) ) + Sp = Sn

ln(n) – ln(ln(n)) + Sp = Sn

Sp = Sn + ln(ln(n)) – ln(n)

ln(n) is same as Sn so make it so below

Sp = Sn + ln(ln(n)) – Sn

Sp = ln(ln(n))


The above was obviously written by me and it was also obvious (to me) that it concerned the relationship between the “vanilla” Harmonic Series and the Harmonic Series of Primes. As a refresher for the reader…

The “vanilla” Harmonic Series is:

1/1 + 1/2 + 1/3 + 1/4 + 1/5 + …

The Harmonic Series of Primes is like the “vanilla” Harmonic Series except that all of the denominators of the terms are only primes:

1/2 + 1/3 + 1/5 + 1/7 + 1/11 + …

The paper with the Mystery Formulas must have been part of an earlier project that I had abandoned about 3-4 months ago when I was not making any progress. Anyway, that project concerned trying to find a simple “proof” of the divergence of the Harmonic Series of Primes.


As I said, I had eventually abandoned the “simple proof” project some months ago because I was not making progress. But what is VERY STRANGE is that:

What is written on that sheet of notebook paper above constitutes the “simple proof” I had been looking for!… Sort of… Let me explain… It’s fascinating.


But first, I need to explain the nomenclature used. That is, what the “symbols” represent (it’s extremely simple but needs to be understood).

Sp is the sum of the terms of the Harmonic Series of Primes. That is, Sp is the “value” of the series.

Sn is the sum of the terms of the “vanilla” Harmonic Series.

Cp is the count of the primes used as denominators in Sp (equivalently, the number of primes ≤ n, where n is the largest denominator in either series).

ln is the natural log.



That said, the last formula on the sheet of notebook paper (above) is:

Sp = ln(ln(n))

The above formula expresses the “value” of the Harmonic Series of Primes. You’ll find this formula for the value of Sp in many places (e.g. Wikipedia); it is essentially Mertens’ second theorem. Anyway, as n goes to infinity we can easily see that Sp diverges! That is, the Harmonic Series of Primes diverges.

Sp diverges! This is what I was trying to show, in a simple manner, in the project I abandoned months ago (due to lack of progress).



So… Do the simple formulas on the Mysterious Sheet of Notebook Paper actually represent a simple proof that I had been searching for?

Wellllllllll….. maybe.

It totally depends on whether the first formula is true. And that formula is:

Formula 1  –>       ln(Cp) + Sp = Sn


So is it true? Although I don’t remember doing so, I had apparently written this stuff down some months ago. If I had seen it somewhere I am pretty sure I’d remember. Now what’s really interesting is that, when I look at that first formula (just above),

I can’t think of any reason why adding the log of the count of terms with a prime denominator, to the value of the Harmonic Series of Primes, would equal the value of the vanilla Harmonic Series!!!!??

I can’t think of a reason; can you? But apparently I thought that in the past! Anyway, I’ll tell you what I can do. I can “test” the formula over billions of numbers to see if it looks like it holds water. So, while that’s not a proof, per se, it would be good evidence one way or the other. Luckily, doing such an experimental test would be easy because I had already written a “SumOfReciprocals” program to help with my first attempt at finding a simple proof of the divergence of the Harmonic Series of Primes.

All that was needed was a few lines of code to display, at the end of each test, the values of each term in Formula 1 (above). For each test execution, the SumOfReciprocals program has already accumulated the values for Cp, Sp, and Sn so all that was needed was to display them in the log. Just below you can find a screen print of one such SumOfReciprocals test execution.  Click on it to see it full size.
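The comparison the program performs can be sketched in a few lines for readers who want to play along. This is an illustrative Python sketch, not the author’s C# SumOfReciprocals program; the sieve, the function name, and the 10⁶ limit are my own choices:

```python
from math import log

def compare(limit):
    """Accumulate Cp, Sp, and Sn for denominators up to `limit`,
    then return both sides of Formula 1: ln(Cp) + Sp, and Sn."""
    # Sieve of Eratosthenes to mark the primes.
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            is_prime[p * p :: p] = bytearray(len(is_prime[p * p :: p]))

    cp = 0      # count of primes <= limit
    sp = 0.0    # Harmonic Series of Primes
    sn = 1.0    # "vanilla" Harmonic Series (starts with 1/1)
    for k in range(2, limit + 1):
        sn += 1.0 / k
        if is_prime[k]:
            cp += 1
            sp += 1.0 / k
    return log(cp) + sp, sn

lhs, rhs = compare(10 ** 6)
print(lhs, rhs, lhs / rhs)
```

Running it for larger and larger limits shows the ratio of the two sides creeping toward 1, which is the same behavior the SumOfReciprocals test runs showed.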

As it turned out, the test results appear to show that the left side of the equation asymptotically approaches the right side of the equation. i.e.

ln(Cp) + Sp ~ Sn

Here is a spreadsheet showing the results of the above where the values of the denominators range from 1..10⁹ (1 to 1 Billion).



The spreadsheet and associated chart show the results of the ratio

( ln(Cp) + Sp ) / Sn

As that ratio approaches 1, it demonstrates that the left side of the equation approaches the right side. That is, that ln(Cp) + Sp approaches Sn.

So where does that leave us?

-1- Based on some experimental evidence, it’s my conjecture that ln(Cp) + Sp ~ Sn

-2- If the conjecture is true, then the series of formulas on the notebook paper constitutes a very simple proof of the divergence of the Harmonic Series of Primes.


So, all that said, how would we prove ln(Cp) + Sp ~ Sn? I’ve thought about it for some time and I can’t think of any reason why ln(Cp) would play any role at all! But the experimental results appear to powerfully indicate otherwise!
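For what it’s worth, the standard asymptotics do suggest why the ratio tends to 1. This is a heuristic sketch using known results, not a proof (γ is Euler’s constant and M ≈ 0.2615 is the Meissel–Mertens constant):

```latex
S_n \sim \ln n + \gamma, \qquad
S_p \sim \ln\ln n + M, \qquad
C_p \sim \frac{n}{\ln n}
\;\Longrightarrow\;
\ln C_p + S_p \;\sim\; (\ln n - \ln\ln n) + (\ln\ln n + M) \;=\; \ln n + M
```

So the two sides differ by roughly the constant M − γ ≈ −0.32, a difference that vanishes relative to Sn as n grows, which matches the asymptotic behavior the test runs showed.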

It’s all a mystery and now the original project has morphed into something much more interesting…

Proving (or disproving)

ln(Cp) + Sp ~ Sn



The End














CPU Bound Logging October 20, 2017

Windows Desktop Apps That Are CPU Bound…

So What’s The Problem?

This article is a bit esoteric but if you have an interest in Windows programming you may find it interesting.  That said…

Sometimes we’ll develop CPU bound programs (“apps”) that perform Bizillions of calculations. For example, a program that finds the prime factors of very large numbers (e.g. 100 digits or more) can run for a very long time. In these types of programs I’ll typically code the “Calculation function” to periodically check whether the user (me) wants to have intermediate results or other useful information “displayed.” The code would also wait for the user to tell the Calculation function to pause, continue, or terminate after viewing the information that was displayed. All that said, here’s the problem…

The typical Windows Desktop app (WinForms or WPF) is a message-driven program. When you click on the button that says “Factor The Number” an event message is generated and is sent to the program to be read and processed. Eventually the event message works its way to its “FactorButton_Click” routine for processing. Visual Studio generates the “button click” routines in our programs but it’s up to us to add the code inside those generated functions/methods. That code that we supply will perform the calculations and may periodically display useful information in a “log” for the user to read. Examples of the kind of information displayed might be the number that’s currently being factored, how many factors have been found so far, how long it’s been working on the current number, etc.

The “log” where the app’s useful information is displayed as text would typically be a Multi-line TextBox. That is, a TextBox serves as the “log” and is the modern GUI version of “printf” (that we would have used “back in the day”).

The act of writing to the “log” (the TextBox) will generate “paint” messages which eventually make their way to the code for the TextBox that is functioning as the “log.” When the “paint” messages are processed by the TextBox code it will actually draw the text on the screen. But the problem is that the “paint” messages for the log often won’t get to the TextBox in a timely fashion as long as the calculation routine has control! Also, the user won’t be able to scroll the TextBox because mouse clicks and keypresses won’t get processed as long as the calculation routine continues to run. The messages (text) now exist in the log’s TextBox but we can’t effectively use it, or anything else on the app’s window until the “Calculation” routine finishes. The app is effectively “frozen” until the Calculation routine ends.

I have lately been writing numerous CPU Bound “calculation” programs and the above issue had been an ongoing pain in the butt. After thinking about the problem on and off for several months I finally decided to solve the problem once and for all. 

Note:  It’s occasionally been the case that when the problem is explained to another programmer they propose “just use the Yield() or Sleep() function.” However, we need to remember, Yield() and Sleep() allow other threads to get control… but the problem is that the code to process events for your own controls, like Buttons and TextBoxes, is NOT in other threads… It’s in your own thread!   And, the next thing often proposed is to use multiple threads as part of a solution;  No thanks… that’s way too complicated and error-prone.  Anywho… persistence and patience eventually led me to the following:

Here’s the basic idea as illustrated above…

The “Log” is contained in the Memory Mapped File. A utility class named MMFlogger in a class library is how an app accesses the log. From the perspective of an app (or programmer), the file consists of just 2 very simple things:

.1. A long integer “command” that can be read or written. How the “command” (a long integer) is interpreted and used is completely up to the app. For example, in the programs I write:

0 (zero) means “continue processing”

1 means “pause” and wait for a new command.

2 means “terminate” the calculations (but not the program).

.2. The second “field” in the file can be thought of as a giant string much like the Text field of a WinForms TextBox (e.g. myTextbox.Text += "\r\nAnother message for the TextBox";).

A user app would typically use just the following 2 MMFlogger functions/methods like so:

long lCommand = myMMFlogger.ReadCommand();


string something = "\r\nAppend something to the giant string in the file";


From the User’s (programmer’s) perspective it’s about as simple as it gets. It’s essentially like writing to a WinForms TextBox!
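To make the two-field layout concrete, here is a minimal sketch of the same idea in Python with the standard mmap module, rather than the author’s C# MMFlogger. The function names, field sizes, and the anonymous mapping are all illustrative assumptions; a real two-process setup would use a named or file-backed mapping that both processes open:

```python
import mmap
import struct

CMD_BYTES = 8            # field 1: one long-integer "command"
LOG_BYTES = 1 << 20      # field 2: the "giant string" (illustrative size)

# Anonymous mapping keeps the sketch self-contained.
mm = mmap.mmap(-1, CMD_BYTES + LOG_BYTES)
log_end = CMD_BYTES      # next free byte in the string field

def write_command(cmd: int) -> None:
    """Store the long-integer command in the first 8 bytes."""
    mm[0:CMD_BYTES] = struct.pack("<q", cmd)

def read_command() -> int:
    return struct.unpack("<q", mm[0:CMD_BYTES])[0]

def append_text(text: str) -> None:
    # Strings can't go into a memory-mapped file directly; they must be
    # converted to bytes first (one of the "gotchas" mentioned later).
    global log_end
    data = text.encode("utf-8")
    mm[log_end:log_end + len(data)] = data
    log_end += len(data)

def read_log() -> str:
    return mm[CMD_BYTES:log_end].decode("utf-8")

write_command(0)                               # 0 = "continue processing"
append_text("\r\nAnother message for the log")
print(read_command(), read_log())
```

The calculation process would poll read_command() periodically, while the viewer process reads the string field and writes commands.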

So ItBegsTheQuestion…

How is the User actually going to view the log?

In the flowchart you will see “Process B” which is a utility app/program that monitors the log in the MemoryMappedFile. It can be set to automatically (or manually) read the data (strings) in the log and write them to a TextBox for the User to view on the screen. Of course, the User can scroll forwards and backwards, select – copy – and paste to a text file or spreadsheet, etc. And there are buttons the User can click to have “commands” sent to the “Calculation app” (Process A) to tell it to pause, continue, terminate, etc. The User can also have the data (string) cleared from the log/file (perhaps in preparation for a new round of computations by the “Calculation app”). In a nutshell:

– We eliminate the log’s TextBox from the User app and replace it with a Memory Mapped File that the app uses just like it would use a TextBox but with the added benefit that it can now easily receive “commands” from the User (person).

– The User (person) now views and interacts with the log via a TextBox in the “Utility App” (Process B). In addition, the User (person) can now send “commands” to the “Calculation app” instructing it, for example, to pause, continue, terminate the calculations, etc.

So that’s it except for a few notes:

  1. As shown in the flowchart above, Process A is the “Calculation app.” It uses the MMFlogger Class to write text to the “Log.” The “Log” is implemented via a Memory Mapped File.

  2. Process B is a general purpose utility app for viewing the “Log” that the “Calculation app” is writing to.  It can be used with any Windows desktop application.  When we code our “Calculation app” we can, if we want, have it automatically start the utility “Log” viewing app (Process B). Or it can be started manually by the User.

  3. Above I stated that the second “field” in the Memory Mapped File can be thought of as a giant string. The truth is that you can NOT write or read strings to/from a Memory Mapped File. The strings must first be converted to/from byte[] arrays as part of the process and there are some hidden “gotchas” along the way. The utility class MMFlogger that was developed as part of this project hides all that from you and me.

  4. As it turns out, reading/writing a Memory Mapped File is VERY FAST! It’s much faster than writing to TextBoxes so you should not fret over using the MMFlogger Class in your CPU Bound app.

If you want more details or the code/projects just ask.


The End


Statistical Primes Part 2 September 21, 2017

Statistical Primes  Part 2

Of course, the 2 helpful “prerequisites” for this article are:

Twin Primes and

Statistical Primes (part 1)



In Statistical Primes (part 1) we discussed how the TwinPrimes program originally just counted Primes and Twin Primes. Then, for the “Statistical Primes” article it was enhanced to (closely?) estimate the counts of Primes and Twins by extrapolating from statistical samples. The technique used was a simple variation of “Stratified Sampling.” The method used there was to specify

How many strata to use and

How many elements (samples) within each stratum to process.

In particular, the elements/samples in each stratum were simply the first N numbers of each stratum. It was a very simple and easily implemented method which yielded surprisingly accurate estimates of the number of Primes and Twins.

In the prior article I noted that investigating other techniques would be left to the reader. That was kind of true… after a few days I got the bug to see if the results would be any better (or worse) if the samples, from within each of the strata, were chosen randomly. And that is what this article covers.

A button was added to the TwinPrimes program giving the user a second way to estimate the count of Primes and Twins. With that button, the samples within the strata would be chosen at random using the C# Random Class. 

It may interest you to know that the Random class only accepts int parameters and only returns an int for the random number chosen. Initially this presented a small problem since the TwinPrimes program deals with really BIG numbers, but the limitation was overcome by limiting the size of any single stratum to 2×10⁹ (2 Billion or less… good enough for me for now).
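The random variant is easy to sketch. Below is a toy Python version, not the TwinPrimes program; the trial-division primality test, the parameter values, and the fixed seed are illustrative assumptions. It estimates the prime count by testing randomly chosen numbers inside each stratum and scaling the hit rate up:

```python
import random
from math import isqrt

def is_prime(n):
    """Trial division; fine for a sketch (the real program uses Miller-Rabin)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    return all(n % d for d in range(3, isqrt(n) + 1, 2))

def estimate_primes(limit, strata, samples_per_stratum, seed=1):
    """Estimate the count of primes in 1..limit via random stratified sampling."""
    rng = random.Random(seed)          # fixed seed so runs are repeatable
    width = limit // strata
    hits = 0
    for s in range(strata):
        lo = s * width + 1             # first number of this stratum
        for _ in range(samples_per_stratum):
            if is_prime(rng.randrange(lo, lo + width)):
                hits += 1
    # Scale the sampled hit rate up to the whole range.
    return round(hits / (strata * samples_per_stratum) * limit)

print(estimate_primes(10 ** 6, 500, 40))   # actual pi(10^6) is 78498
```

Because the generator is seeded here, repeated runs give the same estimate; seeding from the clock (as the article describes) makes each run vary.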

Sample Results

In general, I suspected that randomly sampling N numbers within each stratum would yield more accurate estimates than using the first N numbers within each stratum. This was indeed the case, but the accuracy differences were minimal at the low end and became much more pronounced as we moved to larger numbers. For example:

For the range of 1 → 1 Billion

When the first N numbers in each stratum are used for the sample then

The estimated count of Primes differs from the actual count of Primes by 0.24 of 1% (by 0.00236 of the actual count).

The estimated count of Twins differs from the actual count by 1.3% (by 0.0131 of the actual count).

When the numbers in each stratum are randomly chosen for the sample then

The estimated count of Primes differs from the actual count of Primes by 0.12 of 1% (by 0.0012 of the actual count).

The estimated count of Twins differs from the actual count by 0.2% (by 0.00195 of the actual count).

For the range of 1 → 1 Trillion

When the first N numbers in each stratum are used for the sample then

The estimated count of Primes differs from the actual count of Primes by 0.07 of 1% (by 0.00068 of the actual count).

The estimated count of Twins differs from the actual count by 0.92 of 1% (by 0.0092 of the actual count).


When the numbers in each stratum are randomly chosen for the sample then

The estimated count of Primes differs from the actual count of Primes by 0.04 of 1% (by 0.00041 of the actual count).

The estimated count of Twins differs from the actual count by 0.07 of 1% (by 0.00073 of the actual count).



An interesting note here. In the TwinPrimes program I chose to have the Random class/object “randomly” seeded from the system clock. As such, the estimates it gives will vary with each execution. Of course, I could have seeded the class (object) with the same number every time and the results would always be the same. But that would present its own issues. Of course, I could make it an option to do it either way but…


When I started I didn’t know whether one method would be significantly superior to the other but it appears that both methods of sampling give (I think) very good estimates. And of course, there are many other ways of “statistically” estimating the count of primes and prime related “stuff.”

Below are 2 things you may be interested in:

  1. Screen print of latest TwinPrimes program. The “GO” button simply uses brute force to calculate the number of Primes and Twins from “Start Number” thru “How many numbers to test…” The 2 “GO WITH SAMPLING” buttons estimate the counts using the techniques talked of above.
  2. There’s a signpost ahead… it’s a copy of a spreadsheet I used to track results.







The End


Use The Force Rachel September 18, 2017

Rachel… may the centripetal force be with you.


Wait for it!…. Wait for it!…(it’s at the end).


The following is real short and is the essence.


The End


Apple FaceID

The video down below was on SquawkBox the other day.  It’s about the use of biometrics; specifically facial recognition.  They have their place but don’t kid yourself… they should NOT be the ONLY thing.  The “password,” or better still, the passphrase is still vital.  Remember, you can change a password but once your biometric data is stolen there’s nothing you can do!  I am surprised that aspect was not discussed in the following video.  Also, we continually hear “experts” (including the one in the video below) saying that people are just too lazy or stupid to use passwords effectively.  That may be true for some but not for all!  Not all people have a problem with passwords but because some do, the so-called “experts” often say we should abandon them.  That is, that we should abandon one of the 3 pillars of identity, which is “WHAT YOU KNOW.”  That’s just absurd and it shows the ignorance of some of these so-called “experts.”  Personally, I would NOT use a device/system that did not offer the password/passphrase as at least one part of the security/identity equation.  You may also want to read a prior post/article on passwords for additional info.

Steve Jobs’  face recreated from his  stolen biometric data.  Hold this photo  up to an iPhone and you too can be Steve.


Anyway, here’s the SquawkBox video.


The End


Statistical Primes September 9, 2017

Use of

“Random” / Stratified  Sampling

To Closely Approximate

Counts of Prime Numbers and Twin Primes

And be sure to read Statistical Primes Part 2  When you are done here!

I don’t want to spend too much time on this because I’m getting tired and the hurricane is coming tomorrow night. Anyway,


While working on and writing the last article about Twin Primes,  it started to become painfully obvious that, as the numbers got exponentially larger, the run-times of the TwinPrimes program also got exponentially larger (duh!). For example:

It required 35 minutes on my PC

To compute the number of Primes and TwinPrimes

For N = 1 to 1 Billion

Now that doesn’t seem so bad until we decide we want to do that for even larger/more numbers, or for other calculations or measurements related to Primes for larger numbers. For example, if I wanted to count the Primes and TwinPrimes for N = 1 to 1 Trillion, on my PC that would take about 27 days! That is simply not reasonable. That led me to investigate and experiment using “statistical” methods to very quickly and very closely approximate results 💡. Just to jump ahead a little in order to demonstrate what I’m talking about… On my PC using my TwinPrimes program

It required 35 minutes to identify and count all of the Primes and TwinPrimes For N = 1 to 1 Billion

Then I enhanced the TwinPrimes program by adding the ability to “closely” approximate the results very quickly by using the statistical technique called “Stratified Sampling.” In the enhanced TwinPrimes program we can now optionally specify:

How many strata to use and

How many elements (samples) within each stratum to process.
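That scheme (test only the first N numbers of each stratum, then scale the hit counts up to the full range) can be sketched in a toy Python version. This is not the actual TwinPrimes program; the trial-division primality test and the parameter values are illustrative assumptions:

```python
from math import isqrt

def is_prime(n):
    """Trial division; fine for a sketch (the real program uses Miller-Rabin)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    return all(n % d for d in range(3, isqrt(n) + 1, 2))

def estimate_counts(limit, strata, first_n):
    """Estimate prime and twin-pair counts for 1..limit by testing only
    the first `first_n` numbers of each stratum."""
    width = limit // strata
    prime_hits = twin_hits = 0
    for s in range(strata):
        lo = s * width + 1
        prev = None                     # last prime seen in this sample
        for k in range(lo, lo + first_n):
            if is_prime(k):
                prime_hits += 1
                if prev == k - 2:       # a twin pair inside the sample
                    twin_hits += 1
                prev = k
    scale = width / first_n             # fraction of each stratum sampled
    return round(prime_hits * scale), round(twin_hits * scale)

primes_est, twins_est = estimate_counts(10 ** 6, 500, 50)
print(primes_est, twins_est)   # actual pi(10^6) is 78498
```

Note that twin pairs straddling a sample boundary are missed, which nudges the twin estimate slightly low; the effect shrinks as the sample size grows.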

Using the enhanced program, the 35 minute run-time was reduced to a mere 20 seconds! Let me repeat that… The CPU/run-time was reduced from 35 minutes to a mere 20 seconds! However, the results (the counts of Primes and Twin Primes) are now estimates based on the samples processed. Now the obvious question is: how good/accurate are the results (estimates)? Here are some results:

For N = 1 –> 1 BILLION (1 to 10⁹)

        • The estimated count of Primes is less than ¼ of 1% different (greater) than the actual count of Primes for N = 1 to 1 Billion.
        • And, the estimated count of Twin Primes is less than 1/3 of 1% different (greater) than the actual count for the same range (1 to 1 Billion).

Here are a couple more interesting examples using the Stratified Sampling capabilities of the enhanced TwinPrimes program:

For N = 1 –> 1 TRILLION (1 to 10¹²)

Without using sampling techniques the required run-time would exceed 26 days! That’s clearly unacceptable.

If we use the sampling techniques the run-time is reduced to 78 seconds! Yup… you read that right… from 26 days down to a ridiculous 78 seconds!!!

But there is  a price for trimming that run-time from almost a month down to 78 seconds.  And that price is that now our results are estimates instead of actual counts.  However, the estimates are pretty good.


.1.  The estimated count of Primes differs from the actual count by just .07 of 1%. That is, the estimated Prime count was about 1.00068 times the actual Prime count.

.2.  The estimated count of Twin Primes differs from the actual count by .9 of 1% (that is, 9-tenths of 1%).



For N = 1 –> 1 QUADRILLION (1 to 10¹⁵)

Without using sampling techniques the required run-time would exceed 72 years! That’s way too long!

If we use the sampling techniques the run-time is reduced to just 15 minutes! Yup… you read that right… from 72 years down to just 15 minutes!!!

But again we do pay a price for trimming that run-time from many decades down to 15 minutes.

.1.  The estimated Prime count exceeds the actual Prime count by just .28 of 1%. That is, the estimated Prime count was about 1.0028 times the actual Prime count.

.2.  The estimated Twin Primes count exceeded the actual Twin Primes count by .99 of 1% (let’s just round it to 1%).


Based on the above, using statistical sampling techniques to estimate the counts of Primes and Twin Primes seems pretty effective (depending on your requirements or goals).

It turns out there are multiple statistical techniques available to be used but I’ve tried just the one that I dreamed up and which I later learned is a simple variation on “Stratified Sampling.” In the case of using the enhanced TwinPrimes program, when using the sampling/estimating  option, we can specify how many strata to evenly divide 1–>N into and we also specify how many numbers to process in each stratum (processed sequentially starting with the first number in the stratum). This technique seemed reasonable and is predicated on the conjecture that the distribution of primes, although not “random” in the “pure” sense, is:

  • Pretty much random but the “thinning” effect of the Prime Number Theorem must always be taken into account.
  • Random enough for many purposes
  • Random enough to make effective use of common statistical techniques for “closely” estimating things like counts of Primes and Twin Primes.

Again, there are other statistical/sampling techniques that could be used (effectively?) other than the variation on Stratified Sampling that I use. I leave it to the reader to pursue that.



An important thing to think about here…


… The Prime Number Theorem (PNT) describes (quantifies) the “thinning” of the Primes as we move further out on the number line. This thinning effect is one reason I chose to use Stratified Sampling (although I didn’t know it was called that at the time). The thinking was… the more strata (layers) we have, the better we approximate (account for) the thinning effect of the Prime Number Theorem. Of course, if we go too far with this we lose the program efficiency! The obvious ultimate extreme is to have so many strata that we end up processing EVERY number because we have a stratum for every number! All that said, with the enhanced TwinPrimes program we can specify how many strata we want (and how many numbers to process within each stratum/layer).

We also have to specify how many elements (numbers) to process within each stratum.  But we need to choose a size that can reflect the “essence” of each stratum (layer).  For example, if we don’t process enough elements (numbers) then we can easily be misled.  For example, if we process just 1 number in a stratum, and that number is composite, then we might be led to believe that all numbers in that stratum are NOT Prime!  On the other hand, if we choose to process too many numbers/elements of a stratum then we do not realize any savings/trimming of processing time!



So that’s it.   When I started the Twin Primes project my goal was to try to definitively convince myself whether (or not) there’s  an infinity of twin primes  and hopefully how they might be distributed.   I did NOT think the project would lead to investigating statistical sampling vis-a-vis  counting Prime related phenomena and how the results further and strongly reinforce the notion of Primes being randomly distributed but within the constraints of the Prime Number Theorem.

I think this turned out to be a pretty good project.  How ’bout you?


Below are a couple of interesting things you may want to examine:

  1. A screen print of a run of Primes.exe using the Sampling/Estimating option for 1–>N = 1 BILLION along with 10,000 strata (layers) and 1000 numbers for each stratum.
  2. A section of a spreadsheet where I recorded example results.

Click on this Screen Print for a better view.  Note: “est.”  means “estimates”


Click on the spreadsheet to get a clear view.


The End










Twin Primes September 2, 2017

The Hot Twins Project

The Twin Primes Project

I thought this blog post would just be a journal entry about the Twin Primes project during July-August of ’17.  And that, like many of these posts, it’s probably of minor or no interest to anyone but the writer (of course, this is true of 99.57% of all of the blogs out there).  That said… I just finished writing this post. It came out good/interesting. See if you agree.

Lock The Gates!!

This project started as simply an attempt to explore various aspects of Twin Primes in the hopes that an interesting project and/or insight would result.

For the new reader, if you are not already familiar with Twin Primes then spend just a few minutes and follow these links as they provide clear explanations and background:

The first thing done as part of this project was to modify my PrimeTest C# program to not only look for primes, but to also identify Twin Primes in the process. The vast majority of the work for identifying Twin Prime pairs was already done by the PrimeTest program merely by identifying/finding the primes. The PrimeTest program had a function coded in it that, when given an integer, would return a boolean that indicates whether or not the given number was a prime.

The name of the function was/is IsPrime(); it implements the Miller-Rabin primality testing algorithm and uses the BigInteger class available with Microsoft’s C# and .NET libraries. The IsPrime() routine is also used in some of my other programs (I had cut and pasted the code).  Anyway, as part of this project I thought it would be a good thing to move that IsPrime() functionality to a C# class library I had earlier started building for these projects; so I did (move it, that is). And I updated existing programs to use the IsPrime() from the classlib.
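The author’s IsPrime() is C#, but for reference the Miller-Rabin test it implements looks roughly like this in Python. This is a generic sketch, not the program’s actual code; the fixed base set is a known deterministic choice for numbers up to about 3×10²⁴:

```python
def is_prime(n: int) -> bool:
    """Miller-Rabin with a fixed, deterministic set of witness bases."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:                     # each base is one "round" of the test
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False                # a is a witness: n is composite
    return True
```

C#’s BigInteger has the same ModPow building block, which is why the algorithm ports to it so naturally.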

The modified PrimeTest program was able to provide some insights into how often and when Prime Twins occur but, each additional Twin Primes result I wanted got more and more tedious to implement in the code because I had to be careful to not damage existing functionality. After some time this prompted me to write a brand new program just for Twin Primes. Of course,  the name is TwinPrimes.exe.

After constantly mulling over the notion of Twin Primes for some weeks, and gathering information much of which was the result of the running of the new TwinPrimes.exe program, I came to some conclusions.

Like others, I conjecture that there are an infinite number of Twin Prime pairs. I come to that conclusion based on the following:

-1- First, it’s based on the conjecture that primes are distributed essentially randomly among the odd numbers although they “thin out” according to the Prime Number Theorem. Also see the 2 recent blog postings about primes that are the sum of 2 squares. And

-2- So, in general, the “density” of primes is 1/ln(N). Or in other words, the probability of an integer between 1 and N being prime is 1/ln(N). That said, and again assuming “random” distribution of primes, the probability of 2 primes landing next to each other would be “about”

(1/ln(N)) x (1/ln(N)), or 1/ln(N)²

In reality, the probability of 2 primes “randomly” landing next to each other and thus forming a Twin Prime pair is greater than the above because each time it happens we must remove those two primes from the pool of primes and we must remove the “slots they occupy” from the pool of potential landing places. With this in mind, you may want to read the following:


Of course, we know the above is not completely accurate and is somewhat overly simplified but it’s good enough for understanding why there are an infinite number of Twin Prime pairs!! It’s also good for estimating the count of Twin Prime pairs in order to reinforce our understanding based on the above principles. In addition, actually counting Twin Prime pairs using the TwinPrimes.exe program provides even more support/confirmation that the notion of estimating Twin Prime pair counts via the above principles is correct/reasonable.


The “above principles” are, again, based on the “random” distribution of primes and the resulting probabilities of 2 primes “landing next to each other.”


Below is an excerpt from a spreadsheet I made with some data/results from running the TwinPrimes.exe program.

As you can see from the spreadsheet, the actual count of Twin Prime pairs exceeds the (overly?) simplified “calculated” values based on 1/ln(N)² (which was to be expected).

Following the spreadsheet is a corresponding chart comparing the actual count of Twin Prime pairs vs the calculated count (using 1/ln(N)²). N ranges from 1 to 1 Billion.
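A scaled-down version of that comparison can be reproduced in a few lines. This is a Python sketch rather than TwinPrimes.exe, and note that the “calculated” count for the range 1..N is N × 1/ln(N)², i.e. the pair probability times the number of available slots:

```python
from math import log

def twin_comparison(limit):
    """Sieve 1..limit, count actual twin-prime pairs, and compare with
    the simplified calculated count limit / ln(limit)^2."""
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            is_prime[p * p :: p] = bytearray(len(is_prime[p * p :: p]))
    actual = sum(1 for k in range(3, limit - 1)
                 if is_prime[k] and is_prime[k + 2])
    calculated = limit / log(limit) ** 2
    return actual, calculated

actual, calculated = twin_comparison(10 ** 6)
print(actual, round(calculated))   # the actual count exceeds the calculated one
```

Even at 10⁶ the gap between actual and calculated is visible, in the same direction the chart shows.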




If I wasn’t before, I am convinced now, that the Twin Primes conjecture is true (that there are infinite Twin Prime pairs).

At the beginning of this post I didn’t think it would be very interesting. By the end (now) I think it’s definitely interesting (although many readers may (will?) think otherwise).


Open The Gates!!


SumOf2Squares and Random Primes June 30, 2017


This post/article is a followup to a previous post about the Sum Of 2 Squares and how none of the terms of the formula can share prime factors. It’s required reading (but you’ll like it).


When last we left off, we had been discussing that:

1/2 of all primes are of the “SumOf2Squares” type (aka “SOTS” type”)


This ratio of 1/2 is seemingly extremely consistent

across all ranges of any significant size!


Anyway, so why are 1/2 of the primes seemingly of the “SumOf2Squares” type (aka the “SOTS” type)? Here is my conjecture (“explanation”). Follow me on this…


First of all, we know that a “SOTS prime” P is a prime that can be expressed as the sum of 2 squares; like so…

P = X² + Y²

And we also know that for this to be possible, (P – 1) must be divisible by 4. For example:

13 – 1 = 12 and 12 is divisible by 4 so 13 is a “SOTS prime” which means we can express 13 (P) as the sum of 2 squares; like so…

13 = 3² + 2²
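Finding the actual X and Y for a given SOTS prime is easy by brute force. Here is a small Python sketch (the helper name is hypothetical, not part of any program mentioned in this post):

```python
import math

def sum_of_two_squares(p):
    """Return (x, y) with p == x*x + y*y, or None if no such pair exists."""
    for x in range(1, math.isqrt(p) + 1):
        y_squared = p - x * x
        y = math.isqrt(y_squared)
        if y >= 1 and y * y == y_squared:
            return (x, y)
    return None

print(sum_of_two_squares(13))  # (2, 3), i.e. 13 = 2² + 3²
```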

With that in mind, we might also observe that starting with 8, every other EVEN integer is divisible by 4. For example:

8, 12, 16, 20, 24, … are all divisible by 4.


From this we realize that every other ODD integer O will satisfy the first SOTS requirement: that (O – 1) is divisible by 4.

For example 9, 13, 17, 21, … 9-1 is divisible by 4, 13-1 is divisible by 4, and so on.


However, actually being a prime is the other/second requirement of being a “SOTS prime” and we can see that not every other ODD integer is a prime! For example, neither 9 nor 21 from the above list is a prime. But, what we’re after here is “explaining” why 1/2 of the primes are SOTS primes.


The Effect Of Random Distribution of Primes

Now, IF the primes are randomly distributed among the odd integers, then we could expect as many primes P to randomly “land” on an ODD number that fulfills the first requirement as not. That is, it would be as likely as not that a prime P “lands” on an ODD number meeting the requirement that (O – 1) is divisible by 4. This would explain why, as conjectured in the prior post/article, 1/2 of the primes are seemingly of the “Sum Of 2 Squares” type. That is, why 1/2 of the primes can be expressed as the Sum Of 2 Squares.

When I say “seemingly,” it’s because this is based on experimental evidence using the PrimeTest.exe program (see the prior post). Whew!
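That experimental evidence is easy to reproduce. Here is a short Python sketch (an illustrative stand-in for PrimeTest.exe) that tallies the odd primes below one million by whether (P – 1) is divisible by 4:

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes returning all primes <= limit."""
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if is_prime[i]:
            is_prime[i * i :: i] = bytearray(len(is_prime[i * i :: i]))
    return [n for n in range(2, limit + 1) if is_prime[n]]

primes = primes_up_to(1_000_000)
sots = sum(1 for p in primes if p > 2 and (p - 1) % 4 == 0)
odd_primes = len(primes) - 1              # exclude the lone even prime, 2
print(sots / odd_primes)                  # very close to 0.5
```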

It is generally thought that primes are “sort of” randomly distributed along the number line but within the “fact” that they “thin out” according to the Prime Number Theorem (PNT). Or put another way, within any significant/sizable range they are “kinda sort of pretty much” distributed randomly. We don’t know whether, or under what conditions, this “sort of nearly” random distribution falls apart. For example, does “nearly random” distribution of primes fall apart completely after 10^1234567890123456789? Or, conversely, could all primes greater than 10^1234567890123456789 be SOTS primes? Who knows. That said, at the bottom of this post are links to related articles that you may want to read.


So here is my first conjecture:

Due to the “pretty much” random distribution of primes, “pretty much” 1/2 of all primes will be SOTS primes.

The above first conjecture is pretty strong, especially when using the term “pretty much.” It would probably (“pretty much”) not garner much support at the next AMS conference, so let’s try this conjecture instead:


Due to the “random enough” distribution of primes, even as we go to infinity, we can say there are an infinite number of SOTS primes (an infinite number of primes that can be expressed as the Sum Of 2 Squares).

I’m going to go with this last (second) conjecture and I’m sticking to it… but my gut tells me both are true.


The End…Except for the interesting links below.



There are many more articles to be found by searching on “random distribution of primes” or just “distribution of primes.”


New Pattern Found In Prime Numbers


Peculiar Pattern Found in “Random” Prime Numbers – Last digits of nearby primes have “anti-sameness” bias


Prime number theorem

Structure and randomness in the prime numbers


The End



No Comments on SumOf2Squares and Random Primes
Categories: Uncategorized

Gerry’s Sum Of 2 Squares Theorem June 28, 2017

Gerry’s YAPFO Theorem on 

Sum Of 2 Squares


After reading this article be sure to read the followup article explaining why 1/2 of all primes are of the “Sum of 2 Squares” variety.


For whatever reason I seem to be fascinated with whether (or not) the terms of various formulas have prime factors in common. It is the subject of quite a few posts on this blog. Anyway…


Fermat’s theorem on Sums Of 2 Squares states that an odd prime P is expressible as

P = X² + Y²

with X and Y integers, if and only if   P ≡ 1 (mod 4)

That is, if P-1 is divisible by 4.

For example:

5 = 1² + 2²,    41 = 4² + 5²,    etc.

On the other hand, the primes 3, 7, 11, 19, etc. cannot be expressed as the sum of 2 squares because,

for them, P – 1 is not evenly divisible by 4.
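Fermat’s criterion is easy to spot-check by brute force. The following Python sketch (the function names are hypothetical, just for illustration) verifies, for every odd prime below 1000, that a sum-of-two-squares representation exists exactly when P ≡ 1 (mod 4):

```python
import math

def is_prime(n):
    """Trial-division primality test, fine for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, math.isqrt(n) + 1))

def has_two_square_rep(p):
    """Can p be written as x² + y² with integers x, y >= 1?"""
    for x in range(1, math.isqrt(p) + 1):
        y_squared = p - x * x
        y = math.isqrt(y_squared)
        if y >= 1 and y * y == y_squared:
            return True
    return False

# The "if and only if" holds for every odd prime checked
for p in range(3, 1000, 2):
    if is_prime(p):
        assert has_two_square_rep(p) == (p % 4 == 1)
print("Fermat's criterion verified for all odd primes below 1000")
```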

Anyway, I recently stumbled into the above theorem via an excellent YouTube video on the Numberphile channel. Take a look (link below).

Does the guy in the following Numberphile video look like Max Von Sydow or what?



My initial thought was that the Sum Of 2 Squares theorem might make an interesting project with regard to which terms of the formula do (or do not) share prime factors (i.e. which terms might have prime factors in common). It would be another YAPFO project (YAPFO = Yet Another Prime Factor Oddity). Of course, I would start out by writing a program to show details for which candidate primes have terms with 1 or more prime factors in common.

P = X² + Y²

Almost immediately it dawned on me that the “P” term could never share a prime factor with the other terms (X and Y) because P is prime, and X and Y are less than P.

The above immediately led to … Ok… so if P won’t be sharing a prime factor with either X or Y then the problem is reduced to whether or not X can share a prime factor with Y.

Ok. This could still be an interesting project with some coding. A couple of minutes later while making coffee it dawned on me that the X and Y terms could not possibly share any prime factors either. If they did share a prime factor we would end up with something like the following:

P = (Fc * Fx2 * Fx3 * … * Fxn)² + (Fc * Fy2 * Fy3 * … * Fyn)²

where the F’s are the prime factors of X and Y. In particular, Fc is a prime factor common to both X and Y.

If there actually were an Fc common to both the X and Y terms, then we could divide both sides by Fc² and get the following…

P / Fc² = ( Fx2 * Fx3 * … * Fxn )² + ( Fy2 * Fy3 * … * Fyn )²

Non-Integer = Integer

which is a contradiction (P is prime, so Fc² cannot divide it evenly) that tells us that there are no prime factors shared by both X and Y. So… Gerry’s theorem on



The Sums Of 2 Squares states that

for every odd prime P expressible as

P = X² + Y²

There are no prime factors common to

any of the terms P, X, or Y.
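The theorem is also easy to confirm empirically. This Python sketch (hypothetical helper names) checks every SOTS prime below 10,000 and verifies that in every representation P = X² + Y², X and Y share no prime factor, i.e. gcd(X, Y) = 1:

```python
import math

def is_prime(n):
    """Trial-division primality test, fine for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, math.isqrt(n) + 1))

def two_square_reps(p):
    """All pairs (x, y) with x <= y and x² + y² == p."""
    reps = []
    for x in range(1, math.isqrt(p // 2) + 1):
        y_squared = p - x * x
        y = math.isqrt(y_squared)
        if y * y == y_squared:
            reps.append((x, y))
    return reps

# Candidates with p % 4 == 1: 5, 9, 13, 17, ...
for p in range(5, 10_000, 4):
    if is_prime(p):
        for x, y in two_square_reps(p):
            assert math.gcd(x, y) == 1
print("gcd(X, Y) = 1 in every representation checked")
```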



So the whole episode was conceived as a YAPFO project but resolved itself in less than an hour, so I guess we can’t really call it a “project.” Oh well. But it was still worth documenting.


Here are some related links you may want to follow:


The End

Breaking News!

It’s been 2 days since I first posted the above.   But for some reason I kept thinking about 

What portion of all primes could be expressed as

the sum of 2 squares (let’s call them “SOTS” primes):


P = X² + Y².

It should be rather simple to figure this out since I already had a program I’d written called PrimeTest.exe. I used that program for an article written almost 1 year ago called “Primality Testing For Huge Integers” that you can read about here:


For PrimeTest.exe we specify a starting number and how many numbers to examine/test. It will return summary stats about how many of the integers in that range are prime. And it will, if you want, also provide details about each of those primes in the range. It does other stuff too. Anyway, all I did was modify the PrimeTest.exe code to:


1.  Identify which primes in the selected range can be expressed as the sum of 2 squares (P = X² + Y²), i.e. where (P – 1) is evenly divisible by 4.

2.  At the end of each “test,” print stats on how many of the primes in the range were “SumOf2Squares” primes (SOTS primes).

The testing results were extremely interesting although I have no explanation for them. As it turns out:


1/2 of the primes are of the “SumOf2Squares” type (the “SOTS” type)


This ratio is extremely consistent across all ranges of any significant size!


I found the above extremely interesting but also very odd. Based on experience, I was expecting the number of SOTS primes to vary in some way with the natural log of the count of all primes in general, or with the log of the size/value of the primes.

Again, 1/2 of the primes are of the SOTS variety, and this is extremely consistent and varies little! But why this should be the case is a real mystery, and a very interesting one at that! Solving this mystery could make for a great project.
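To see how consistent the ratio is across ranges, here is a Python sketch (again just an illustrative stand-in for PrimeTest.exe) that samples two widely separated windows of 20,000 integers and reports the SOTS fraction of the primes in each:

```python
import math

def is_prime(n):
    """Trial division; adequate for these window sizes."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    return all(n % d for d in range(3, math.isqrt(n) + 1, 2))

def sots_ratio(start, length):
    """Fraction of primes p in [start, start + length) with (p - 1) % 4 == 0."""
    total = sots = 0
    for n in range(start | 1, start + length, 2):   # odd candidates only
        if is_prime(n):
            total += 1
            sots += (n - 1) % 4 == 0
    return sots / total

for start in (10 ** 6, 10 ** 7):
    print(start, round(sots_ratio(start, 20_000), 3))   # both land near 0.5
```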


After reading this article be sure to read the followup article linked to just below:

why 1/2 of all primes are of the “Sum of 2 Squares” variety.


As a final note, below is an example of using the PrimeTest.exe program:



No Comments on Gerry’s Sum Of 2 Squares Theorem
Categories: Uncategorized

Comey Was Raped June 12, 2017


James Comey is an imposing man. He’s 6’8″ tall with an athletic build. And he was the head of the FBI!!! – Arguably the most powerful law enforcement agency in the world and almost certainly in the U.S. Then he was raped by a 70-year-old fat man when he was alone in a room with him. Well… maybe it was a figurative raping, but the feelings of inadequacy and shame were the same.

“I should have done more. I wish I had been stronger. I should have said ‘no.’  I shouldn’t have gone to the 2nd and 3rd meetings.”


Anyway… below are the contents of 2 opinion pieces recently published by the New York Times; they echo what I’ve said above (and then some).

Kudos to the NYT and the 2 brave sisters who connected the dots and shined a bright light on the real meaning of the recent Comey testimony. Read them below (or click on the link to the PDF reprint of the NYT web pages that you can find at the end of this article).


The Truth Will Set Us Free

And it’s in the New York Times OpEd section


First NYT Opinion Piece


As I listened to James B. Comey, the former F.B.I. director, tell the Senate Intelligence Committee about his personal meetings and phone calls with President Trump, I was reminded of something: the experience of a woman being harassed by her powerful, predatory boss. There was precisely that sinister air of coercion, of an employee helpless to avoid unsavory contact with an employer who is trying to grab what he wants.

After reading Mr. Comey’s earlier statement, I tweeted about this Wednesday night, and immediately heard from other women who had seen that narrative emerge. How recognizable it was that Mr. Comey was “stunned” to find himself in these potentially compromising positions. His incredulity, mixed with President Trump’s circling attempts to get his way, were poignant. For a woman who has spent a lifetime wrestling with situations where men have power they can abuse, this was disturbingly familiar.

On Jan. 27, Mr. Comey received a last-minute dinner invitation from the president, and then learned it would be “just the two of us.” On Thursday, Mr. Comey revealed that he had had to break a date with his wife in order to dine with Mr. Trump. Already, something about this “setup” made him “uneasy.”

The central business of this intimate dinner was Mr. Trump’s insistence: “I need loyalty, I expect loyalty.” Mr. Comey immediately recognized that this was a press for something he did not want to give. He froze: “I didn’t move, speak, or change my facial expression in any way during the awkward silence that followed.”

That reaction — the choice of stillness, responses calculated to neither encourage nor offend that characterized so many of his dealings with Mr. Trump — is so relatable for any woman. During his testimony, Mr. Comey was asked why he had not responded more robustly, why he had not told Mr. Trump that he, the president, was acting inappropriately or reported his behavior immediately to others in authority.

Mr. Comey expressed regret that he had not been “stronger” about it, but explained that it was all he could do to focus on not saying the wrong thing. In other words, he wanted to avoid granting any favor while avoiding the risk of direct confrontation — a problem so deeply resonant for women.

During that interminable, awkward dinner, Mr. Comey struggled to convince Mr. Trump of the danger of “blurring” boundaries. But Mr. Trump was not deterred and returned to the subject of the loyalty he must have. There you hear the eternal voice of the predatory seducer: the man who knows how hard he can make it for a woman to refuse his needs.

Mr. Comey tried to wriggle out of the trap being set for him. He offered his “honesty,” hoping this would appease his insatiable host. Mr. Trump countered with a demand for “honest loyalty.” Mr. Comey acquiesced. Yet as he documented this “very awkward conversation,” his concession of this phrase troubled him. He hoped he had not been misunderstood by the president.

The victim of sexual harassment is constantly haunted by the idea that she said or did something that gave her persecutor encouragement. Serial harassers, of course, have an intuitive sense of this, and are skilled at manipulating and exploiting it.

Mr. Comey, you are not alone. How many of us have played over and over in our minds an encounter that suddenly took a creepy, coercive turn? What did I say? Were my signals clear? Did I do something ambiguous? Did I say something compromising?

At a White House ceremony on Jan. 22, Mr. Comey reportedly tried to blend in with the curtains, so that he would not be noticed by the president. Mr. Trump called to him and pulled him, unwilling, into a hug. What woman has not tried to remain invisible from an unwelcome pursuer’s attentions?

To this series of bizarre interactions, in which he faced escalating pressure, Mr. Comey reacted with rising anxiety and distress. Time after time, Mr. Trump reverted to his questionable agenda, and Mr. Comey, at each pass, tried to parry the president’s unwanted advances.

This dynamic with the president became so disturbing to Mr. Comey that, after an Oval Office meeting in February, he implored the attorney general, Jeff Sessions, “to prevent any future direct communication between the president and me.” Mr. Comey did not want to be left alone with his boss again. We’ve been there, Jim.

In their final exchange, on April 11, Mr. Trump told the F.B.I. director, “I have been very loyal to you, very loyal; we had that thing you know.” On May 9, having rebuffed the president, Mr. Comey was fired.

“We had that thing.” Once more, the seducer asserts a shared intimacy that was not really there, attempting to ensnare his victim with an imputed complicity.

In the infamous “Access Hollywood” tape, Mr. Trump said of any woman he wanted: “I just start kissing them. It’s like a magnet. Just kiss. I don’t even wait. And when you’re a star, they let you do it. You can do anything.” And he added: “Grab ’em by the pussy. You can do anything.” With the power of the presidency at his disposal, Mr. Trump thought that he could use the psychology of coercive seduction on the nation’s chief law enforcement officer.

Victims of sexual harassment often face skepticism, doubts and accusations when they tell their story. That’s part of the predator’s power. But I’m here to tell James Comey, and all the women and men who have suffered at the hands of predators: I believe you.



2nd Piece By NYT

Women Say to Comey: Welcome to Our World


A man is being publicly grilled about why he was alone in a room with someone he felt was threatening him. Why didn’t he simply resign if he felt uncomfortable with what his boss was asking him to do? Why did he keep taking calls from that boss, even if he thought they were inappropriate? Why didn’t he just come out and say he would not do what the boss was asking for?

Sound familiar? As dozens of people noted immediately on Twitter, if you switch genders, that is the experience of many women in sexual harassment cases. James Comey, the former director of the F.B.I., explained to senators during today’s hearing that he felt acutely uneasy and hesitant to directly confront his boss, the president of the United States. That’s right, even a savvy Washington insider, the same height as LeBron James and no stranger to the cut and thrust of power, seemed slightly ashamed that he had not been able to do so.

“Maybe if I were stronger, I would have,” he said, trying to answer a question about why he didn’t speak his mind. “I was so stunned by the conversation that I just took it in.”

These are the emotions that many women have struggled to explain in the face of sexual harassment, and the ones that have often given defense attorneys grist for what appear to be inconsistencies.

Imbalance of power often lies at the heart of sexual harassment or assault cases, from those of Roger Ailes and Bill O’Reilly at Fox News to the trial of Bill Cosby, underway the same day as the hearing of the Senate Intelligence Committee. On Wednesday, Andrea Constand, Mr. Cosby’s accuser, concluded two days on the witness stand, with defense attorneys suggesting that her continued contacts with Mr. Cosby undermined her credibility. Unsurprisingly enough, today’s hearing shows that power can discomfit and silence men as well as women.

Sexual harassment and assault often provoke debates about credibility, fairness and bias. But at least for today, the tables were turned, and men could glimpse what women have often endured.



Here are links to PDFs that are reprints of the NYT articles from their web site.



The End



No Comments on Comey Was Raped
Categories: Uncategorized