Reality - Version 0.1

Tennis.

James

I nearly forgot . . .

I need you to elaborate. Are you talking about projection? I ask because I use this word in a number of domains, including mapping and memory. Orthogonality also comes to mind. I also thought that you might have been referring to an orthogonal array. I agree with the simple and obvious comment - I also think that simple and obvious is a wonderful thing once I start compounding - that way I know what a complex system is about.

James

My goodness :blush: sorry.

James

If it is not too much trouble, I still require your thoughts on this post.

It is important to me that we are sharing the same terminology.

Thank you in advance.

:smiley:

Sorry, I merely meant it in the common sense of right angles, as in a Cartesian coordinate system (“ortho-” meaning “proper/right/straight/standard/authentic”).

James

Starting out simple and obvious . . . on the journey to n-dimensions. This post might not serve any purpose to you - but it will to me.

In that case I think you are speaking of orthogonality and the term orthogonal, where -gon comes from the Greek -gōnos (‘-angled’) and -al is an adjective-forming suffix meaning ‘relating to’ or ‘of the kind of’. Orthogonal: of or involving right angles; at right angles.

I am not certain whether it can be called an orthogonal array, because I think that is a combinatorial construct in mathematics. I call it a multidimensional array, or more specifically a two-dimensional array (shortened to 2D Array). In my 2D Array the foundation address is represented as the segment and the indices are the columns. This is not how it maps to random access memory - that is base address and offsets.
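For illustration only (a rough Python sketch with made-up names and sizes, not anything from this thread), the base-address-and-offsets mapping looks roughly like this:

# A 2D index (segment, column) collapses to "base address + offset",
# which is how a 2D Array actually sits in random access memory.
ROWS, COLS = 4, 8        # segments and columns (arbitrary example sizes)
ITEM_SIZE = 8            # bytes per element, e.g. a double
BASE = 0x1000            # pretend base address of the array

def address(row, col):
    # every (row, col) pair maps to exactly one memory location
    return BASE + (row * COLS + col) * ITEM_SIZE

print(hex(address(0, 0)))   # 0x1000 - the base address itself
print(hex(address(2, 3)))   # 0x1098 - base + (2*8 + 3) * 8 bytes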

The way I set up my 2D Array was with the intention of being as you say, simple and obvious. We can put this matter to rest for now.

On Tennis: I might be able to come up with something more optimized - let me ponder Tennis and optimization for two or three days. In the meantime I do have some more questions for you, if you are up to answering them.

8-[

Let the angle and velocity (always a velocity of 1 in this case) determine the xyz coordinates, not the other way around. After each time step t = t + 1: x = x + dx, y = y + dy, z = z + dz. In that way, every conceivable angle is handled while the PtA remains the same, propagating. The arrays are no longer arrays of fixed locations with their associated PtA level. Instead, the arrays are array-lists of original PtAs and their consequent double-precision locations. Array1 is the last list of PtA xyz-locations and array2 is the next list of PtA xyz-locations, reversed as t increments - a tennis ball (or very many) being tracked through spacetime. Orthogonality is no longer relevant.

So emmm… did you follow that? :-k

James

Do you mean angle and speed? You are welcome to talk physics, you know. I will take it that you mean speed - in which case that is not a problem.

Well okay, usually “velocity” is angle plus speed, so I misspoke a bit. Yes, “angle + speed” is what I was talking about. And that angle is to be expressed as the 3 Cartesian components dx, dy, and dz. The “speed” is always to be “1” (in whatever units of time - one tic of the clock). Within each step of 1 tic of the clock, the PtA “tennis ball” bundle travels dx, dy, and dz from wherever it was. The trick that gets you out of the conundrum of geometry is to simply keep track of each tiny bit of PtA (aka “an Afflate”) as though it were something like a tiny tennis ball passing through space. And you have billions of them passing through each other (by specific rules of engagement).
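As a rough sketch of that unit-speed idea (Python here just for compactness; the names are illustrative and not from the thread), one way to build such a step is:

import math
import random

def random_unit_step():
    # pick dx and dy at random, then choose dz so the step length is exactly 1
    dx = random.random() - 0.5                          # in [-0.5, 0.5)
    dy = random.random() - 0.5
    sign = 1.0 if random.random() > 0.5 else -1.0
    dz = sign * math.sqrt(1.0 - (dx * dx + dy * dy))    # dx^2 + dy^2 <= 0.5, so this is real
    return dx, dy, dz

dx, dy, dz = random_unit_step()
print(math.sqrt(dx * dx + dy * dy + dz * dz))           # always ~1.0: one tic of travel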

Void of the actual rules of engagement, the result in pictorial form would look something like this:

That is a pictorial of “empty space”.

The end result using the proper rules of engagement displays the cause of the formation of subatomic particles, their behavior, and the entire rest of the universe.

James

I think so.

Let Abit(s) = affectance bit(s). Let N = any particular Abit in the array. Let the symbol “&” be the array separator.

With two arrays we are tracking many Abits:

PtA(N) & PtA(N)
x(N)   & dx(N)
y(N)   & dy(N)
z(N)   & dz(N)

Speed is one clock cycle (theoretically equating to the speed of light in physical space), represented as a tic.
Angle is randomly preset at the first moment of populating the Abit, and updated as the “difference” between n & dn, where n can be x, y or z.
Currently this is void of the actual rules of engagement between Abits.

Each tic, n is assigned the value of dn through the switching of the primary array.
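That “switching of the primary array” can also be pictured as swapping references rather than copying element by element. A minimal sketch (Python, invented names, only two Abits):

# two buffers of Abit locations; each tic we write the advanced positions
# into "nxt", then swap, so the fresh buffer becomes the primary one
cur  = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]     # current x, y, z per Abit
step = [(0.1, 0.0, 0.0), (0.0, 0.2, 0.0)]     # dx, dy, dz per Abit (fixed here)
nxt  = [None] * len(cur)

for tic in range(3):
    for i, (x, y, z) in enumerate(cur):
        dx, dy, dz = step[i]
        nxt[i] = (x + dx, y + dy, z + dz)
    cur, nxt = nxt, cur                        # switch the primary array
    print(tic, cur)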

Well sorry, from that I can’t tell if you understood or not. Let me see if I can remember enough BASIC to convey the idea (always comes down to the basics - Clarify, Verify,…).

DIM AryP(1000000) as double   ' PtA level for each afflate
DIM Arydx(1000000) as double  ' dx component of vector
DIM Arydy(1000000) as double  ' dy
DIM Arydz(1000000) as double  ' dz
'
DIM Ary1x(1000000) as double  ' x, current location of each afflate
DIM Ary1y(1000000) as double  ' y
DIM Ary1z(1000000) as double  ' z
'
DIM Ary2x(1000000) as double  ' x, next location of each afflate
DIM Ary2y(1000000) as double  ' y
DIM Ary2z(1000000) as double  ' z
'
' Initialize random PtA levels for all afflates to be between -1 and +1
'
For n = 1 to 1000000
    AryP(n) = (RND() - 0.5) * 2
next n
'
' Initialize the original array of 1000000 afflates with their vector components normalized to v = 1
'
For n = 1 to 1000000
    Arydx(n) = RND() - 0.5
    Arydy(n) = RND() - 0.5
    k = RND() - 0.5 : If k > 0 then k = 1 else k = -1       ' acquire random +/- 1
    Arydz(n) = (1 - (Arydx(n)^2 + Arydy(n)^2))^0.5 * k      ' normalize so that velocity = 1
next n
'
' Initialize the first location array
'
For i = 1 to 1000000
    Ary1x(i) = Arydx(i)
    Ary1y(i) = Arydy(i)
    Ary1z(i) = Arydz(i)
next i
'
' Begin sequencing through time
'
For t = 1 to 10000000       ' number of tics to watch
    '
    ' Calculate the next tic, array1-to-array2
    '
    For n = 1 to 1000000
        Ary2x(n) = Ary1x(n) + Arydx(n)
        Ary2y(n) = Ary1y(n) + Arydy(n)
        Ary2z(n) = Ary1z(n) + Arydz(n)
    next n
    '
    ' Test and correct for boundaries
    ' Display array 2
    '
    ' Calculate the next tic, array2-to-array1
    '
    For n = 1 to 1000000
        Ary1x(n) = Ary2x(n) + Arydx(n)
        Ary1y(n) = Ary2y(n) + Arydy(n)
        Ary1z(n) = Ary2z(n) + Arydz(n)
    next n
    '
    ' Test and correct for boundaries
    ' Display array 1
    '
next t      ' go through all 10,000,000 tics, displaying array 1 then array 2 back and forth

The end result of that should appear something like:

capisci?

James

Yes, I am certain that I understand you. Your code is fundamentally representing the same idea I put in my last post. We are writing the same thing two different ways. I think it is important for each of us to understand the other, wouldn’t you agree?

I tell you what I will do next . . . I will find the middle ground between your last post and my last post, then I will write an example with comments that you should be able to understand. I will also use BASIC to illustrate (always comes down to the basics - Clarify, Verify,…). Your BASIC is quite understandable.

I believe a little patience is required here on both our parts. I only require the answers to the following questions:

1 ► I think it is important for each of us to understand the other, wouldn’t you agree?

2 ► Do you understand the programming constructs “procedures” and “functions”? They are similar.

3 ► Do you agree that patience is also required? Patience and Clarify, Verify,…

Don’t panic - the clearer we communicate the less time we will spend being unnecessarily patient.

You are asking a rock to please have patience. No need for such extreme, extreme, consideration. A little is fine. Don’t get carried away. And in my case, simply get on with it, assuming that I am NOT judging, but merely trying to make progress when it is available to be made (although the sooner the better. I am not going to live forever).

James

OK, that answers my questions in a nutshell.

First of all - I can cut the number of arrays you have from ten down to two. I can do this because an array is just a base address (pointing to a location in memory) and the number of required offsets (other memory locations that are relative to the base address). Between each offset I can define as much memory as I want to use, as long as it remains uniform between all offsets. That aside, I can create a defined uniform structure between each offset to hold all of our variables.

Structure BothArrays ; using double precision (represented by the .d after each variable)
  PtA.d
  x.d
  y.d
  z.d
EndStructure

And because each array is fundamentally the same, we can use the same structure. Oh, and yes, it is easy to replicate your empty space.
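For comparison only (a Python sketch with invented names, not the Structure code above), the same “two arrays of one uniform structure” idea might look roughly like this:

from dataclasses import dataclass

@dataclass
class Afflate:
    # one record per afflate: its PtA level and its current location
    PtA: float
    x: float
    y: float
    z: float

# two arrays in total: the current state and the next state, each holding whole records
# (a smaller count here than the million in the thread, just for the sketch)
current_state = [Afflate(PtA=0.0, x=0.0, y=0.0, z=0.0) for _ in range(1000)]
next_state    = [Afflate(PtA=0.0, x=0.0, y=0.0, z=0.0) for _ in range(1000)]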

Yeah, the whole module-coding thing is fine. What matters is only that you understand the fundamental intent. I suspect that you are going to have to change a little what you have so far, but that’s fine. Do what works in your language.

Nothing counts until you display an actual product (unassociated to whatever code you were using).

We are discussing (for those perhaps unaware) one particular model of reality, an “ontology”, being emulated by a computer so as to verify its veracity. A great many people propose possible models of reality (such as FC and Eugene). But now we are talking about “rubber meets the road” analysis. Does this RM:Affectance Ontology actually represent anything real or significant? I personally know that answer, but we are about to find out if someone else can.

James

I have no intention of getting side tracked on particulars.

Yes, I figured the whole module-coding thing would make the emulator much easier to understand under the hood. As far as PtA is concerned I am certain that I have demonstrated in the earlier examples that I fully grasp what it does now - in other words - it takes on the potential to affect and passes on the potential to affect.

What I have been trying to achieve the last couple of days is to transition from the philosophy to the physics and code. You have been a little difficult to work with, but that is your personality and I am fine with that - I like you and I like your philosophy, so I will keep working at this. Displaying empty space is the easy part - understanding PtA is fundamental - the rules of engagement are the all-important things that are going to bring everything together.

We could do this in any language - I get that. Displaying empty space is very easy. An actual product is only going to show that my code works from my point of view.

I know what we are discussing. I have seen many models of reality. The idea of what we are doing is to get “rubber meets the road”. I have put a lot of time and thought into RM:AO because I personally have a feeling about the answer already - if I had to hazard a guess I would say what you know and what I am guessing are one and the same. Well I can do this easily - the code part is the easy part.

We are going from philosophy to emulator. The road from point A to point B only has a limited amount of short cuts.

Well let me know the particulars and maybe I can adjust some of that.

James

That is nice of you to offer, James, but I think you are fine the way you are. I really appreciate you helping me out with the PtA thing. I am going over everything now and considering rules of engagement for the affectance bits - from memory, I think these are called afflates. I am going to start by looking at density.

I am going to be posting some stuff to myself that includes code, philosophy and numbers - I just wanted to let you know in advance. I will address these particular posts to myself, and the posts to you will be marked as such - they will likely be requests for verification and a question here and there.

Arcturus Descending

I think that what it means to exist is to have objective reality, to occur, to be affected by the surrounding existence and to affect the surrounding existence.

There will never be any such thing as a perfect ideal world; however, it is a concept that can be a great tool when considering these things, provided the parties involved know how to use it. Even dead bodies exist, so non-existence remains a shallow concept.

By existence, I mean everything. It is surprising how much of it we can see objectively - it is enough for us to be able to guess, with a highly probable degree of accuracy, what else may lie beyond our metaphorical reach.

We will never have all of the facts - not while things keep changing - when things stop changing, what need will we have of all the facts? We do not need all of the facts to see much of existence objectively, we just need the right facts - the facts that are relevant to us.

We may see with different eyes, and we may not truly know each other’s experiences, but we do know how to measure frequencies, and we understand a lot about energy and matter, and these objective facts can be verified over and over again by different people to get the same results. When objective results are different from what we know, then either something has changed or we have made an error.

I am glad that we somewhat agree that part of what it means to exist is to be affected by the surrounding existence and to affect the surrounding existence. I have always figured that there is meaning that exists between the question and the answer. The answer is waiting to be found, and therefore the answer comes before the question. We are so used to the confinement of the answer coming after the question that we make the dangerous assumption that we cause the answers.

We do not really invent - we just configure.