JsonGrinder.jl is a collection of routines that facilitates the conversion of JSON documents into structures used by the Mill.jl project.
Motivation
Imagine that you want to train a classifier on data looking like this:
{
  "ind1": 1,
  "inda": 0,
  "logp": 6.01,
  "lumo": -2.184,
  "mutagenic": 1,
  "atoms": [
    {
      "element": "c",
      "atom_type": 22,
      "charge": -0.118,
      "bonds": [
        { "bond_type": 7, "element": "c", "atom_type": 22, "charge": -0.118 },
        { "bond_type": 1, "element": "h", "atom_type": 3, "charge": 0.141 },
        { "bond_type": 7, "element": "c", "atom_type": 22, "charge": -0.118 }
      ]
    },
    ⋮
    {
      "element": "c",
      "atom_type": 27,
      "charge": 0.012,
      "bonds": [
        { "bond_type": 7, "element": "c", "atom_type": 22, "charge": -0.118 },
        { "bond_type": 7, "element": "c", "atom_type": 27, "charge": -0.089 },
        { "bond_type": 7, "element": "c", "atom_type": 22, "charge": -0.118 }
      ]
    }
  ]
}
and the task is to predict the value of the key mutagenic (in this sample it's 1) from the rest of the JSON.
With most machine learning libraries assuming your data is stored as tensors of a fixed dimension or as a sequence, you will have a bad time. In contrast, JsonGrinder.jl assumes your data is stored in the flexible JSON format and tries to automate most of the labor using reasonable defaults, while still giving you the option to control and tweak almost everything. JsonGrinder.jl is built on top of Mill.jl, which itself is built on top of Flux.jl (we do not reinvent the wheel). Although JsonGrinder was designed for JSON files, you can easily adapt it to XML, Protocol Buffers, MessagePack, and other similar structures.
There are 5 steps to create a classifier once you load the data.

1. Create a schema of the JSON files (using sch = JsonGrinder.schema(...)).
2. Create an extractor converting JSONs to Mill structures (extractor = suggestextractor(sch)). The schema sch from the previous step is very helpful here, as it helps to identify how to convert nodes (Dict, Array) to Mill structures (Mill.ProductNode and Mill.BagNode) and how to convert the values in leaves (to Float32, Vector{Float32}, String, or Categorical).
3. Create a model for your JSONs, which can be easily done by model = reflectinmodel(sch, extractor, ...).
4. Extract your JSON files into Mill structures using the extractor, e.g. extractbatch(extractor, samples) (at once if all data fit into memory, or per minibatch during training).
5. Use your favourite methods to train the model; it is 100% compatible with Flux.jl tooling.

Steps 1 and 2 are handled by JsonGrinder.jl, steps 3 and 4 by a combination of Mill.jl and JsonGrinder.jl, and step 5 by a combination of Mill.jl and Flux.jl.
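The five steps above can be sketched end-to-end as follows. This is a hedged sketch, not the full example: samples is assumed to be an already-loaded vector of parsed JSON documents, and the training itself (step 5) is only indicated by a comment.

```julia
using JsonGrinder, Mill, Flux

# samples::Vector of parsed JSON documents is assumed to be loaded already
sch = JsonGrinder.schema(samples)                                  # step 1: schema
extractor = suggestextractor(sch)                                  # step 2: extractor
encoder = reflectinmodel(sch, extractor, d -> Dense(d, 10, relu))  # step 3: model
ds = extractor.(samples)                                           # step 4: extraction
# step 5: train with standard Flux.jl tooling (loss, optimiser, Flux.train!)
```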
The authors see the biggest advantage in the model being hierarchical and reflecting the JSON structure. Thanks to Mill.jl, it can handle missing values at all levels.
Our idealized workflow is demonstrated in the following example, which can also be found in the Mutagenesis Example; here we break it down in order to demonstrate the basic functionality of JsonGrinder.
The basic workflow can be visualized as follows
Mutagenesis Example
The following example demonstrates learning to predict mutagenicity on Salmonella typhimurium (the dataset is stored in JSON format in MLDatasets.jl for your convenience).
This example is also available as a Jupyter notebook, feel free to run it yourself: mutagenesis.ipynb
This example is taken from CTUAvastLab/JsonGrinderExamples and is heavily commented for clarity.
Here we include all necessary libraries.
using JsonGrinder, Mill, Flux, MLDatasets, Statistics, Random, JSON3, OneHotArrays
We fix the random seed to obtain the same results on every run, for pedagogical purposes.
Random.seed!(42)
Random.TaskLocalRNG()
We define the minibatch size.
BATCH_SIZE = 10
10
Here we load the training samples.
dataset_train = MLDatasets.Mutagenesis(split=:train);
x_train = dataset_train.features
y_train = dataset_train.targets
100-element Vector{Int64}:
 1
 1
 0
 ⋮
 0
 0
This is step 1 of the workflow.
We create the schema of the training data, which is the first important step in using JsonGrinder. This computes both the structure (also known as the JSON schema) and histograms of the occurrences of individual values in the training data.
sch = JsonGrinder.schema(x_train)
[Dict] # updated = 100
├─── lumo: [Scalar - Float64], 98 unique values # updated = 100
├─── inda: [Scalar - Int64], 1 unique values # updated = 100
├─── logp: [Scalar - Float64,Int64], 62 unique values # updated = 100
├─── ind1: [Scalar - Int64], 2 unique values # updated = 100
╰── atoms: [List] # updated = 100
╰── [Dict] # updated = 2529
├──── element: [Scalar - String], 6 unique values # updated = 2529
├────── bonds: [List] # updated = 2529
│ ╰── [Dict] # updated = 5402
│ ┊
├───── charge: [Scalar - Float64], 318 unique values # updated = 2529
╰── atom_type: [Scalar - Int64], 28 unique values # updated = 2529
This is step 2 of the workflow.
Then we use the schema to create an extractor converting JSONs to Mill structures. The suggestextractor is executed below with default settings, but it allows heavy customization.
extractor = suggestextractor(sch)
Dict
├─── lumo: Categorical d = 99
├─── inda: Categorical d = 2
├─── logp: Categorical d = 63
├─── ind1: Categorical d = 3
╰── atoms: Array of
╰── Dict
├──── element: Categorical d = 7
├────── bonds: Array of
│ ╰── Dict
│ ┊
├───── charge: Float32
╰── atom_type: Categorical d = 29
This is step 3 of the workflow: we create the model using the schema and the extractor.
Create the model
We create a model reflecting the structure of the data.
encoder = reflectinmodel(sch, extractor, d -> Dense(d, 10, relu))
ProductModel ↦ Dense(50 => 10, relu) # 2 arrays, 510 params, 2.070 KiB
├─── lumo: ArrayModel(Dense(99 => 10, relu)) # 2 arrays, 1_000 params, 3.984 KiB
├─── inda: ArrayModel(Dense(2 => 10, relu)) # 2 arrays, 30 params, 200 bytes
├─── logp: ArrayModel(Dense(63 => 10, relu)) # 2 arrays, 640 params, 2.578 KiB
├─── ind1: ArrayModel(Dense(3 => 10, relu)) # 2 arrays, 40 params, 240 bytes
╰── atoms: BagModel ↦ BagCount([SegmentedMean(10); SegmentedMax(10)]) ↦ Dense(21 => 10, relu) # 4 arrays, 240 params, 1.094 KiB
╰── ProductModel ↦ Dense(31 => 10, relu) # 2 arrays, 320 params, 1.328 KiB
├──── element: ArrayModel(Dense(7 => 10, relu)) # 2 arrays, 80 params, 400 bytes
├────── bonds: BagModel ↦ BagCount([SegmentedMean(10); SegmentedMax(10)]) ↦ Dense(21 => 10, relu) # 4 arrays, 240 params, 1.094 KiB
│ ╰── ProductModel ↦ Dense(31 => 10, relu) # 2 arrays, 320 params, 1.328 KiB
│ ┊
├───── charge: ArrayModel(identity)
╰── atom_type: ArrayModel(Dense(29 => 10, relu)) # 2 arrays, 300 params, 1.250 KiB
This allows us to create the model flexibly, without the need to hardcode individual layers. The individual arguments of reflectinmodel are explained in the Mill.jl documentation. Briefly: for every numeric array in the sample, the model creates a dense layer with the given number of neurons (10 in this example). For every vector of observations (called a bag in Multiple Instance Learning terminology), it creates an aggregation function which takes the element-wise mean and maximum of the feature vectors and concatenates them. Instead of using the fsm keyword argument to change the last layer, we simply append a dense layer with 2 outputs (one per class) at the end of the neural network.
model = Dense(10, 2) ∘ encoder
Dense(10 => 2) ∘ ProductModel ↦ Dense(50 => 10, relu)
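As an alternative to appending Dense(10, 2) manually, the fsm keyword argument of reflectinmodel can override the layer builder at selected positions so that the reflected model itself ends with 2 output neurons. This is a hedged sketch; in Mill's fsm dictionary the empty string addresses the root node.

```julia
# alternative sketch: override the layer builder at the root ("") so the
# reflected model itself ends with 2 output neurons
model = reflectinmodel(sch, extractor, k -> Dense(k, 10, relu);
                       fsm = Dict("" => k -> Dense(k, 2)))
```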
This is step 4 of the workflow: we call the extractor on each sample.
We convert the JSONs to Mill data samples. This classification problem has two classes, but we want to infer that from the labels. The extractor is callable, so we can broadcast it over a vector of samples to obtain a vector of structures with extracted features.
ds_train = extractor.(x_train)
100-element Vector{ProductNode{…}}:
 ProductNode
 ProductNode
 ProductNode
 ⋮
 ProductNode
 ProductNode
This is step 5 of the workflow: we train the model.
Train the model
Then we define a few handy functions and a loss function, which is the logit binary cross-entropy in our case. Here we add +1 to the labels, because the labels are in {0, 1} whereas the argmax of the model output is in the {1, 2} range.
loss(ds, y) = Flux.Losses.logitbinarycrossentropy(model(ds), OneHotArrays.onehotbatch(y .+ 1, 1:2))
accuracy(ds, y) = mean(Flux.onecold(model(ds)) .== y .+ 1)
accuracy (generic function with 1 method)
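To see why the +1 shift works, here is a pure-Julia sketch of what onehotbatch and onecold do in this setting (illustrative only, not the OneHotArrays implementation):

```julia
labels = [0, 1, 1, 0]                 # raw {0,1} targets
indices = labels .+ 1                 # shifted to valid class indices {1,2}

# onehotbatch(labels .+ 1, 1:2) produces one column per sample:
onehot = [i == ix for i in 1:2, ix in indices]   # 2×4 Bool matrix

# onecold inverts it: the index of the maximum in each column
decoded = [findmax(onehot[:, j])[2] for j in axes(onehot, 2)]
decoded == indices                    # true
```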
We prepare the optimizer and collect the trainable parameters of the model.
opt = AdaBelief()
ps = Flux.params(model)
Params([Float32[-0.14937238 0.30904353 …], Float32[0.0, 0.0], …])
-0.25116858 0.22732164 -0.3713533 0.32673198 0.10275717 -0.14351103 0.20194851 -0.016374838; -0.13864289 -0.18477982 -0.024768364 -0.27498746 0.0023075407 -0.36147392 -0.11238454 0.37084544 0.19652653 -0.3638814 0.19958033 -0.3150801 -0.36249408 0.110299475 0.25173324 0.20295398 -0.36562768 0.2903716 -0.13646674 0.21410939 -0.3235486 0.29551083 -0.20739976 0.028016297 -0.221491 -0.31095344 0.12617096 -0.37298492 -0.0397206; 0.2018266 -0.12543383 0.0037681586 0.09182853 -0.028916383 -0.34087643 -0.1967454 -0.2799707 0.18160173 -0.24320191 -0.18050027 -0.29124436 -0.3769101 0.19332138 -0.21888669 0.30859578 0.04526167 -0.27593556 -0.23429592 -0.03733745 -0.05266604 -0.15124442 0.3068688 -0.2845108 -0.18918855 0.37084964 0.060032625 -0.16881776 -0.25421536; 0.34050223 -0.08569696 0.2314965 0.24852464 -0.05376419 0.30885768 -0.03589301 0.26827416 0.025004912 -0.050241087 0.04188553 0.3084969 -0.029199734 0.37600458 0.02569501 0.07947734 -0.2888289 -0.06665385 -0.0874413 -0.1052877 0.22120063 0.015729487 -0.08927837 0.23962869 0.086190164 0.07535892 0.29266477 -0.23879836 0.3890635; 0.26427835 -0.38018537 -0.02523828 0.13411866 -0.32045907 -0.27189255 -0.24801044 -0.3867552 -0.18594141 0.3737637 0.3382588 -0.14686126 -0.15666182 -0.13576832 0.0025671865 0.16302176 -0.2473425 -0.1909632 -0.07821203 -0.3607602 0.14817543 0.32762712 -0.05296122 -0.006121522 -0.24894367 0.20897858 0.046502806 0.11690999 -0.26370883; -0.1791839 -0.3364985 0.3355532 0.06941826 -0.03768888 0.06902577 0.23592754 0.3209204 0.020844877 -0.13871892 -0.26823258 -0.38084528 -0.19671538 0.24705672 0.24816792 0.22618598 0.26679146 0.076907866 -0.25178772 0.08146917 -0.30396792 0.2552407 0.064591594 -0.23718667 -0.19750956 -0.2271961 -0.28695294 -0.3900727 0.36231762], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[-0.2657773 -0.33315572 0.34774923 0.17114742 0.03491177 -0.1051171 0.2065891 0.29601514 0.020180393 0.08397581 -0.3468201 -0.21994883 0.3772128 0.19392519 0.08045101 
-0.33723077 -0.30033654 -0.24810834 -0.02121704 -0.16821337 0.31366858 0.069118425 -0.009303932 -0.11427391 0.012353316 -0.029092321 0.38242576 -0.014049612 0.30474505 -0.11257237 -0.11276359; 0.35310167 -0.23830447 -0.32306924 0.36533415 0.30578592 0.25914893 -0.110355385 0.029365528 -0.040632945 -0.06858993 0.049757656 -0.086929105 0.1332337 0.103301644 -0.07345546 -0.30588192 0.32670072 0.38215548 0.31753597 0.35056838 -0.086903974 0.24868235 0.010170162 -0.06996112 -0.33586085 0.19830057 -0.36825073 -0.30402204 0.14768776 -0.16353276 -0.2812391; -0.32763544 0.17450327 -0.15593922 0.03266833 -0.33007437 -0.21157251 0.05429858 -0.30412045 0.24322832 0.24342094 -0.25131574 -0.3015809 -0.3108706 0.21288972 0.3608225 -0.25445566 -0.33486393 -0.07092791 0.3534975 -0.20660006 0.33224162 -0.19576508 0.060784243 0.037399825 -0.35599282 0.08223404 -0.011937736 0.22720934 -0.1369456 -0.14955051 -0.26962677; 0.2604871 0.15371087 0.05895515 0.11981733 0.29605004 0.25058997 0.26305637 -0.28466457 -0.01725318 0.36188674 -0.08623457 -0.3636396 0.091138355 -0.17194133 0.089896314 -0.29315913 -0.07586184 0.21175899 -0.13717394 -0.273673 0.20284623 0.029248284 0.04678124 -0.28521773 -0.05230144 -0.35816726 0.2566447 0.36683044 -0.3567801 0.17073417 0.29793414; -0.16053551 -0.26393634 -0.28212953 -0.25751072 0.18662387 0.026052924 0.085132435 0.36732185 0.012940455 -0.24887338 -0.14627777 0.008574374 0.13516891 -0.0871096 -0.23850124 0.16552918 0.18848415 -0.28865784 0.04006838 -0.16579783 0.25347847 -0.29356056 -0.013343814 0.32646942 -0.3421506 0.1810457 -0.24780472 -0.31004637 0.02469058 0.11147434 0.2935835; 0.30246964 -0.17502451 -0.28407404 0.3270066 0.26710087 -0.20878315 0.37794688 -0.24404995 0.36772656 0.18393794 0.03831011 -0.14440818 0.35762838 -0.14265008 0.15251558 0.15177959 -0.1911454 0.010686524 -0.16544814 0.06358395 0.31152818 0.33484632 0.043292332 -0.33874944 0.2201297 -0.26483718 0.07783636 0.11590651 -0.03501593 -0.305713 -0.15841551; 0.037324626 0.11669097 
0.079811566 0.34500676 0.18243514 -0.045683663 0.25144878 -0.38208646 -0.07928179 -0.15854444 0.121312976 0.06768503 0.27097705 -0.14808948 -0.11551491 -0.19908813 0.24991782 0.20605934 -0.32696497 0.093972586 -0.26626304 -0.33380598 0.27124432 -0.03902416 0.31892347 -0.045706466 -0.19136864 0.35799032 0.2848882 -0.19392915 -0.2109311; 0.3283317 -0.009006919 -0.08608517 -0.37188822 -0.16045927 -0.3014493 0.10584483 0.17319523 0.17420201 0.22661823 0.14186098 -0.007188772 0.037044078 0.20558187 0.19359133 0.24624231 -0.053090144 -0.14679906 -0.04233453 0.29085213 -0.21275003 -0.06059043 -0.12993993 0.33866477 -0.09085238 -0.19550014 0.18044092 -0.24103126 0.23630504 0.24420062 0.16675448; 0.21895751 0.3665454 -0.24613924 0.36984926 0.23847339 -0.31188506 0.17351504 -0.0054066963 0.102508694 0.20641956 -0.07574587 0.20106119 -0.0076464894 0.10791744 0.091783226 -0.34365404 0.37664077 -0.0845536 -0.13764073 0.06776698 0.13175 -0.3028246 0.16583933 0.37577766 -0.3747351 -0.17322834 -0.007870537 -0.23190126 0.049782876 0.32834643 -0.2716868; 0.29705966 0.059291247 -0.013541001 0.12251279 0.048467137 -0.24149166 0.2134301 0.10317058 0.23206177 0.09624439 -0.17112608 0.2584991 -0.10523768 0.30830225 0.16027562 0.16110837 -0.3739258 -0.0828869 0.077088654 -0.047486626 0.16037157 -0.095143534 -0.19501856 -0.013173122 -0.109111786 0.08926412 0.15381868 0.10776007 0.25870386 0.3431325 -0.25567102], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.18387623 0.29213774 0.27113417 0.34511113 -0.39515603 0.18484809 -0.07040607 0.16631961 0.40850317 -0.29641977 0.03695208 0.11868787 0.14160717 0.10393915 -0.05730854 -0.32775 -0.3207192 -0.43574637 -0.12396111 -0.111158535 -0.28967613; 0.32983634 0.43082199 0.084580615 0.04297666 -0.073779866 0.09438166 -0.43158683 -0.08498586 -0.057174385 0.3147801 0.21391594 -0.26730362 0.39278907 -0.23400393 
-0.3798965 0.25125837 -0.42503124 0.15902989 -0.2849139 0.09212458 -0.223001; 0.08021913 -0.019811083 -0.2602453 -0.12371599 -0.18101226 -0.36277556 -0.017248876 0.37101144 0.19223398 0.047413144 -0.12380436 -0.024725974 -0.17465606 -0.18239266 -0.21481873 -0.24410291 0.05303332 -0.0064423033 -0.273076 0.098525085 -0.38026434; -0.0075370944 -0.3478771 -0.06565848 -0.32314202 0.0038847656 0.15385813 0.3657084 0.16283946 0.3412721 -0.25935102 -0.14418924 -0.3475245 -0.05128779 -0.36149713 -0.40454853 -0.42683554 -0.1889172 -0.31020886 0.35538805 -0.24256875 -0.32673028; -0.20561598 0.29853076 0.068223566 -0.33201852 -0.21337087 -0.11822483 0.01813567 -0.091204174 -0.25782225 0.4049386 -0.03381623 -0.14555565 -0.34109145 -0.1918527 -0.40207788 0.40998214 -0.01082084 0.16432679 0.017044656 -0.37699187 -0.37615085; -0.27515462 0.4262299 -0.35075137 -0.30304873 -0.29709283 0.39324084 0.16930373 -0.16878709 0.2310057 0.2809926 0.21465547 0.020793745 -0.23319444 -0.43228814 0.42361534 -0.013841572 -0.22592513 -0.23437074 0.31902263 -0.031814557 -0.022642225; -0.2073699 0.19160412 -0.34361827 -0.35329288 0.34901232 -0.11455477 -0.013441678 -0.20550165 0.32175392 -0.34809703 0.07340997 0.267614 0.09723253 -0.22180751 0.261098 0.00462765 -0.15797831 -0.26886854 -0.3067364 0.059434455 0.2364871; 0.34576276 0.37775347 -0.26761538 -0.20161085 -0.062385697 0.30365202 -0.20844942 0.11529157 -0.14746617 0.29925156 -0.11010444 -0.17382786 0.2818807 0.052629806 0.35695988 0.025996929 0.027483327 0.43765044 0.22348465 -0.43593228 0.41265625; 0.19411382 0.4192722 0.2217773 0.27639332 0.16110693 0.011401354 0.13030335 -0.406191 -0.28482696 -0.2877185 0.3397525 0.08338529 0.2497212 -0.12641318 -0.35954696 0.21716896 -0.38097602 -0.27852175 -0.38840422 0.31897685 -0.22469871; -0.027911646 0.33953044 -0.43197802 -0.3277015 0.068906926 0.28759927 -0.16228594 -0.20004211 -0.15567702 0.1946773 0.0022238293 0.003362832 -0.41944867 0.023102693 -0.06605397 -0.115440674 0.40675545 0.39955837 
-0.239221 -0.12761727 0.04178301], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.24117164 0.23081018 0.23020047 0.34507298 -0.013347041 -0.020850815 0.15353546 -0.3070773 0.3463223 -0.09651567 0.08607916 0.18823442 0.14351355 0.076092504 0.03769828 0.0763608 -0.33506668 -0.14983098 0.2127508 -0.267094 -0.25533932 -0.25305527 0.24977016 0.08182135 0.3334334 -0.36157438 -0.1296597 0.19180924 -0.15481573; -0.24659802 0.2042754 0.24524958 0.08975254 0.018512273 -0.03955587 -0.3161264 -0.100043215 0.049691122 -0.009771617 -0.03385699 -0.31981054 -0.28444898 0.09005076 0.0709607 -0.32778496 0.20599829 0.15972501 0.09664173 -0.09253009 0.35565388 0.03069687 -0.3896988 -0.2382102 0.08933074 0.09807514 -0.36225003 -0.17342053 0.27090457; -0.10701245 0.29205602 0.022500006 -0.1967973 -0.21218185 0.2770404 -0.075696744 0.11257232 -0.21806908 -0.052632418 -0.3669497 0.053282678 0.1698928 -0.30094364 0.030736988 0.33984885 0.11816216 -0.20094383 0.2934299 -0.1930088 -0.27835616 0.27422175 -0.29868516 -0.25953576 -0.11336538 -0.1668163 0.10884592 -0.102739215 -0.35136923; 0.09995185 0.3740238 0.09093331 0.24295916 0.28177685 -0.08076122 -0.1785638 -0.027987167 0.21452989 -0.32368898 -0.3346672 -0.19106938 -0.060887028 0.24925199 0.31332546 -0.17298199 0.053892307 -0.02549423 0.09734973 -0.17406294 -0.359173 0.3095548 -0.24649371 0.022688674 0.23143922 -0.20671925 0.26358718 0.27954292 -0.3909742; 0.044509012 0.32798532 -0.10594137 -0.25663814 -0.21241653 -0.15894112 0.10908971 0.2244725 0.071865276 -0.35063177 -0.143507 -0.32612067 -0.074408054 -0.18268703 0.32545558 -0.38092068 0.24808338 0.22928363 -0.03150293 0.19530012 0.28829437 0.13678478 -0.06768766 -0.2398067 0.28186354 -0.36230835 0.3754365 0.15221694 0.039020166; -0.24294293 -0.011803708 -0.36900663 0.22918418 0.2558495 -0.19832394 -0.07170527 0.018208068 0.34952796 0.19680403 -0.15465999 0.11854034 -0.03296116 0.10263107 -0.33809257 -0.18315545 -0.34227747 -0.041029114 0.33557522 -0.0610706 
-0.25960845 0.15409061 -0.13662034 0.22102702 -0.25788474 0.14437193 0.12616195 0.012529061 0.23875464; 0.39099997 0.11529044 0.3225645 0.3234444 -0.2461182 -0.09365882 -0.15244123 0.3249415 -0.0016508752 -0.024991024 -0.07935801 -0.33862093 -0.34540707 0.109269686 -0.21480848 0.08751467 -0.02304436 0.1370536 0.38683155 -0.24609445 0.0154467905 0.26219457 -0.0038068742 0.13044453 0.063598745 -0.12062798 -0.021214169 0.029900633 -0.071669966; -0.05993761 -0.029199688 0.10454084 -0.3482234 0.043384954 0.15863363 -0.13256915 0.10845203 0.072859764 0.32101506 0.11337852 -0.3482073 0.15855911 0.116973065 0.13766538 -0.08439237 0.22298312 -0.024832657 -0.0795472 0.2539572 -0.35659096 0.24951795 0.058599968 0.31251192 0.32299358 0.09341011 0.31361246 -0.32447138 -0.17206256; 0.15531501 -0.06817011 -0.3834824 0.27014923 -0.33172 0.22038512 0.03779161 -0.063341014 -0.24196798 0.14497899 0.1684538 -0.1878214 -0.013673644 0.033644572 -0.12545094 0.21755394 0.052252464 -0.16141647 -0.24646117 -0.38751802 -0.30511242 -0.009933212 0.15035051 -0.33847687 -0.25275245 -0.14473164 -0.1521468 -0.18188855 -0.28689748; -0.17784896 0.22801954 -0.043042127 0.19663496 -0.1373094 0.1300303 0.22328293 0.11200931 -0.15121333 -0.09825604 -0.22233291 -0.05199254 -0.2882172 0.032113444 -0.010349637 0.24332583 -0.25216818 0.122266084 -0.29580867 -0.102137536 -0.06862361 0.30719063 0.29821575 -0.3689593 -0.056586392 0.32337222 0.07050397 -0.35758072 0.019520137], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.32194614 0.02380971 -0.22501437 -0.36032766 0.15566674 -0.297637 -0.24393909 0.12982045 -0.35635382 0.34450707 -0.090162225 0.024492297 0.3748882 -0.078394495 -0.13791645 -0.24934126 -0.37850198 -0.23427603 -0.3193447 0.04026885 0.37654284 0.321495 0.3575722 -0.16428298 -0.07784489 -0.31074485 0.30270922 0.018100165 0.10779231 -0.21922015 0.22129102; 0.10493277 -0.19520043 0.12911953 0.36049256 0.27198997 -0.31387398 -0.07124371 0.052522477 0.20773008 -0.2741768 
-0.36049625 -0.36611328 -0.057614923 0.27412373 0.16360623 0.32142434 -0.31787273 -0.2657675 -0.019763673 -0.1743152 -0.1238926 -0.041251868 0.13010657 0.37343383 -0.08701775 -0.36908376 -0.31991625 -0.03028849 -0.2908752 0.28516164 0.073432155; -0.11297377 0.15826078 0.33343196 0.04864198 -0.14663948 -0.16935787 -0.043439947 -0.16519709 0.3603967 0.11164508 -0.25024694 -0.020123389 0.2512338 0.09955868 -0.24670263 -0.09243854 0.01197513 -0.13748774 -0.026419345 0.25536433 0.22802523 -0.15958911 -0.34105316 0.37446436 0.28808743 0.2358513 0.08181764 -0.16074187 -0.16120136 -0.067048416 0.037819784; 0.032144625 0.3310267 -0.1044206 0.31924102 0.21809307 -0.35615262 0.17239207 -0.31631052 -0.23340306 0.27525377 0.18746756 -0.023316149 -0.093683094 0.31575397 -0.10961046 0.18323013 0.050893217 0.13099317 0.1658266 0.10254632 0.25599954 -0.058145516 -0.33983645 -0.15178533 0.1481312 -0.23345208 -0.25571594 0.18555246 0.29681158 0.21766795 -0.06981182; 0.20103945 0.054610185 -0.11852403 -0.29863337 0.33262604 -0.059005544 -0.20107971 -0.073055245 0.35409436 0.1426053 -0.09240845 0.353484 0.044165675 0.13793911 -0.35379267 0.12527679 0.3476704 -0.2674968 -0.37635267 -0.2281344 -0.19972762 -0.18844904 -0.09059144 -0.33873448 0.17717186 -0.0472992 -0.11254478 -0.17235243 0.020634234 0.3649633 0.13211049; 0.041391186 -0.21652496 -0.13840097 -0.27892095 -0.15146096 0.08327785 -0.3515314 0.17915331 0.32422578 0.16136822 -0.26458034 -0.35970098 0.27379677 -0.24814296 0.1577679 0.05014893 0.16742411 0.32470593 -0.02213088 0.37116402 0.31179786 0.27083164 0.20093451 0.11717897 -0.098262735 0.30570415 -0.060793594 0.08445386 -0.08221197 -0.2163695 -0.05867492; -0.19071993 -0.25969958 -0.28932112 -0.27476823 0.07497285 0.14347112 0.018381216 0.051128756 -0.19781297 0.34888896 -0.057928443 -0.1595192 0.26423037 0.122686446 0.25656196 0.05365033 -0.17111386 -0.1923092 -0.30673498 -0.060150314 -0.29093415 -0.06819501 -0.22598134 -0.3582016 0.34429833 -0.16961786 0.30163518 0.32822075 
0.045092694 -0.029632352 -0.20482239; -0.042324636 0.23369387 0.356124 -0.3444401 -0.060261633 -0.034081794 -0.18290581 0.3055088 -0.006386751 -0.19471358 0.20091048 0.35772645 -0.3150698 -0.30110762 0.3421879 -0.18495794 0.05520024 0.29844028 0.12154573 0.11288836 0.20142962 0.1593029 0.37828234 -0.32862365 0.26551613 0.20596512 -0.21350138 0.26182693 -0.13261431 -0.10138354 0.345537; 0.0933797 -0.043870624 -0.07038263 -0.011624077 -0.051332466 0.338141 -0.18900959 -0.05455788 0.15821582 0.35535562 0.3815319 0.35870272 -0.095762186 -0.33861938 0.021429505 -0.07139908 0.3427599 0.2967601 0.19617824 0.3345417 -0.029888641 0.13151693 0.13143885 -0.13661033 0.24028906 -0.1282698 0.02262686 -0.26384613 -0.19027855 0.03009568 0.26859397; 0.14253348 -0.3622154 0.040966988 0.32630506 0.32700264 0.3415182 0.005561838 0.085153505 -0.12972039 0.12767929 0.1344928 -0.19511192 -0.36018446 -0.08069489 -0.29842204 -0.38022628 0.2014147 0.35982415 -0.3221851 -0.2566148 -0.058875117 -0.15206607 -0.15605715 -0.25619745 -0.0015178059 -0.047065575 -0.2553779 -0.17198548 -0.26722896 0.33595622 -0.30300632], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[-0.2870426 0.40116444 0.08956081 0.039807662 0.15528621 0.042076964 0.3016175 -0.1706395 0.08583081 0.25361443 0.050941598 0.33829966 0.23029491 0.12505391 -0.013226443 -0.16689377 0.26096553 -0.40879592 -0.08371569 0.34470332 0.05118615; 0.24154848 0.052357353 0.23326404 0.19517285 -0.10904426 0.24734418 -0.40102175 0.32164198 0.43918875 0.33883545 -0.0886513 -0.31731114 0.4382465 -0.25380474 0.3660351 -0.15432762 0.42614797 -0.010314849 0.21810965 0.217508 0.087220915; -0.22326915 0.04425506 -0.40836433 -0.43354183 0.34074327 0.26369372 0.073627986 0.399125 -0.42903093 0.10642342 -0.036043417 0.2527251 0.1775803 -0.42815584 0.3848986 -0.1524463 -0.27865082 0.0055508413 -0.15734169 -0.12682776 
-0.4063459; 0.017575873 0.009077984 0.16848086 -0.068094395 -0.110608384 0.1636105 -0.0862115 -0.3473899 -0.3675222 -0.32874778 -0.011497328 0.10598142 -0.2902803 0.11589443 0.42842644 0.37817323 0.13225237 0.19801663 -0.28509116 -0.17559457 0.20737262; -0.104861245 0.034512807 -0.22756127 0.16811648 0.121952206 0.37763312 -0.30585015 0.06619636 0.21012925 -0.08586426 -0.21863332 0.22820267 -0.039079934 -0.40450945 0.07584458 0.43638787 -0.39751196 0.36419064 0.34040204 0.0420467 0.14384703; 0.30760086 -0.23023166 -0.0025188331 -0.30647 0.2731295 -0.0035177022 0.10248223 0.41074973 -0.11181179 -0.3450227 -0.05425613 -0.046190914 0.31479812 -0.072312035 -0.23253694 0.33471644 0.20877501 0.22235881 0.43705592 0.24138306 0.10497882; 0.4261311 -0.21265295 0.092790425 0.34332165 0.16648139 0.26273593 0.19408335 -0.17035724 -0.12550284 0.26053402 0.16910155 0.28542808 0.17477119 -0.11355916 -0.13739923 -0.20497558 -0.014417314 0.21846953 -0.34067017 0.27891853 0.28570336; 0.055864412 -0.09827052 0.37853715 0.25717014 0.18522811 -0.35265937 -0.43370205 -0.4175773 0.16896121 -0.09451293 0.30405155 0.0011541068 -0.38567573 0.12817398 -0.42686704 0.3380473 -0.32028487 -0.08710165 0.1784918 -0.044962596 -0.10632556; -0.32414484 -0.07168473 -0.401091 0.35562286 -0.28146884 0.01334649 0.3276865 -0.42849994 -0.15900844 -0.06571297 0.35420704 0.12095549 0.17738421 0.24229744 0.2875925 -0.24264012 -0.39641744 0.09475701 0.18920009 0.11602303 -0.074785605; 0.07979547 0.40427086 0.013089666 0.27722982 0.34899658 0.18403493 -0.3163598 -0.21352533 -0.14277034 0.091853805 -0.057671722 0.3908306 0.31190494 0.43186623 0.11063335 0.12420714 -0.3689055 -0.16281392 -0.41669753 0.32737082 0.081703216], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], Float32[-0.18761852 -0.052535694 0.0045574512 0.22497292 -0.111436464 -0.08278927 -0.24199721 0.27164423 0.048021935 0.0039348053 -0.19183347 -0.21199492 -0.31030956 0.23886453 -0.016440919 -0.28719893 -0.11047198 -0.051914405 
-0.27101493 -0.17248021 -0.09946369 -0.117937356 0.24533401 0.24761905 -0.24991402 -0.06544283 -0.11828575 0.25055712 -0.21721965 0.075461864 0.24004678 0.16537404 -0.12688383 0.19582221 0.08104981 -0.2314832 0.30425376 0.11037822 0.11140268 -0.1131236 0.18652275 0.22759461 -0.2423577 -0.21302304 0.2211852 0.088460155 0.2905272 0.098448314 0.037683677 -0.188897; 0.28486097 -0.12169943 0.25768846 -0.083364986 -0.24318339 -0.17375238 0.104363434 -0.30723086 0.15000357 -0.23444234 -0.031826463 -0.11300497 0.07104212 -0.10065443 0.037941605 0.20101821 0.15820405 -0.017900784 -0.042564534 0.06799807 0.27138618 0.30581033 0.17845021 -0.1210763 -0.26190454 -0.28274384 -0.23858918 0.28318846 -0.018996341 -0.20478942 0.26093015 -0.26902777 -0.17823248 0.1581143 0.30644605 0.032155447 0.13251536 0.06852115 0.11990203 -0.112497486 0.04101646 -0.08615059 0.2474977 -0.3000902 -0.23940948 0.20371576 0.20057796 -0.28178334 0.17326465 -0.048196774; 0.14914069 0.14670585 -0.015446652 -0.116284184 0.041281883 0.0071055614 -0.1062253 0.11440433 -0.2740604 0.09227414 -0.1912688 -0.045827158 0.16112284 0.11181162 -0.29875427 -0.108770095 -0.041481905 0.048241407 -0.27756438 -0.08441218 0.21959288 0.266312 0.2392395 0.31213123 -0.10577056 -0.18673536 0.083306216 0.087203175 -0.15281074 -0.08273099 0.22004563 0.07370038 0.021620562 -0.17784415 0.17591228 -0.3030233 -0.048594706 -0.06778896 -0.22242557 -0.0151013825 -0.19828728 -0.26414752 0.3147847 0.21113236 0.18142095 -0.3126612 -0.31456342 0.18144651 0.24431197 0.3130038; -0.07342059 0.19621573 -0.069334775 0.13800906 -0.02106287 -0.15501863 -0.1928503 -0.26985568 -0.07879532 -0.057272885 -0.13090266 0.055047236 -0.2112982 0.11421637 -0.17000881 0.13243061 0.3070214 -0.09134924 -0.24594249 0.011185237 0.2784737 -0.17633362 -0.15088958 -0.20722885 0.1959088 0.20759617 -0.08670942 0.1615121 -0.04625578 -0.00799895 -0.26223108 -0.14788266 -0.01953221 0.13080047 -0.15989138 0.17604487 -0.0937714 -0.030410402 -0.19610445 0.013888472 
0.24987376 -0.25276184 0.1376674 0.08271414 0.07816913 -0.16051859 0.27253506 0.3081096 0.14031005 -0.09712804; -0.28071636 -0.24880941 0.09803138 0.009896367 0.24410693 -0.16587919 0.24220495 -0.18663734 -0.0052999747 -0.22532347 0.10025695 -0.023790946 0.27888933 -0.08587668 -0.07731623 -0.16370608 0.30664185 0.27938405 0.12752627 0.0040351176 0.07264697 -0.08879581 0.22512922 0.2769331 0.2576174 0.2627165 -0.018798469 0.16079038 -0.038407166 0.14928314 0.09590925 0.06397546 -0.022286184 -0.052180134 -0.037175708 -0.111218944 0.24139394 0.05724088 -0.12584576 0.2550954 -0.23268232 0.05703147 0.016079701 0.08974318 0.27270362 0.3118786 -0.28198916 0.14490551 -0.11536433 -0.039238542; -0.21618007 0.0067844186 0.13326205 0.30043057 0.26320058 0.102281526 0.14359802 0.17069441 0.06590707 0.05109087 -0.080432326 -0.2735308 -0.28710663 0.0019304404 -0.24037577 0.063144274 0.17988832 -0.04556961 -0.20859326 0.22845803 -0.15843646 -0.2951131 -0.017667625 -0.31289667 0.06827016 0.29621032 0.20056498 0.13080631 -0.09368451 0.27318606 -0.20493923 0.22860852 -0.16404845 0.050472595 -0.17177643 0.13690904 0.029014735 -0.19224791 0.050070893 -0.21164513 0.2046614 0.24119964 0.15863226 -0.053365108 0.28342435 0.12569237 0.31557256 -0.23390156 -0.08811888 0.1432875; 0.22667432 -0.026528975 -0.13344511 -0.009095262 0.10796239 -0.24194847 0.015485932 -0.12029988 -0.09658053 0.25415438 0.16111967 -0.084199265 -0.121348135 0.17489249 -0.10094579 -0.14576836 -0.24340987 -0.2772468 -0.124077074 0.20629023 0.043738317 0.1600703 0.0066329506 0.014307516 0.24471545 -0.011871517 0.11649472 -0.061041296 -0.06395322 0.08084236 0.12290586 0.24088714 0.08745518 0.23536848 -0.08039078 -0.23828018 0.038085833 -0.29668593 -0.0021409043 -0.1954613 0.23526116 -0.009679796 0.024597593 -0.2776047 -0.09459731 0.24198213 0.016321342 -0.13159418 0.21266963 -0.30724055; 0.2669016 -0.1598483 0.29249328 0.19198191 0.30493575 -0.24245375 0.091750525 -0.18476753 -0.089809604 0.1529587 -0.08756186 
-0.19301905 0.22358589 0.11155321 0.0018100352 -0.187034 -0.20228738 0.10567809 -0.2881814 -0.12926 -0.1680153 -0.047673725 -0.07178242 0.033855442 -0.018711915 0.16270146 -0.31494737 0.0024156799 -0.03708335 -0.08429935 0.27556106 0.06526674 -0.13904199 -0.017100621 0.07817294 -0.30075735 -0.106721096 0.2308203 -0.16536295 0.12921834 -0.2595945 -0.2847546 0.12741812 -0.30438665 0.24974328 -0.1992584 0.16362254 0.18401843 0.2924148 -0.23765124; 0.31074402 -0.094694115 -0.014838971 0.31381837 0.21834393 0.16682422 -0.13008659 -0.114216484 -0.2225454 -0.18696576 0.28453776 -0.2204235 0.0792651 -0.0626956 -0.29124025 -0.18036447 -0.276397 0.014096636 0.20833458 -0.22381651 -0.07092515 0.25401348 0.25245157 0.22304079 -0.2933343 0.14712267 0.0848653 0.060411524 -0.21958809 0.0035989224 0.110468656 0.23413607 0.16512999 -0.18223132 0.26166636 0.2584796 0.051877197 -0.124807954 -0.2898065 -0.081236295 0.22733893 -0.008287597 0.1422834 -0.25383833 0.13091555 0.23349771 -0.28327826 -0.025150951 -0.123288184 -0.027015157; -0.20909216 -0.22418784 -0.19439164 -0.124106854 -0.12144992 -0.26550457 0.005087701 0.24449344 0.14194635 -0.18293385 -0.16599578 0.015260729 -0.240207 -0.20770033 -0.21506275 0.16361776 0.029939035 0.18481283 0.21695007 0.27632684 0.1700802 -0.0716849 0.24631234 0.28281692 -0.18711068 0.21542114 -0.009494514 -0.20955296 -0.10551056 -0.13941728 -0.0172093 0.03867919 0.07009995 0.274379 -0.22836542 -0.12848076 -0.24079621 0.2974726 -0.184321 -0.05464825 -0.15316848 -0.10142489 0.1481808 0.13654594 0.22446081 0.056623023 0.19530806 0.117109485 -0.015082496 0.13912281], Float32[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]])
Lastly, we turn our training data into minibatches, and we can start training.
data_loader = Flux.DataLoader((ds_train, y_train), batchsize=BATCH_SIZE, shuffle=true)
10-element DataLoader(::Tuple{Vector{ProductNode{NamedTuple{(:lumo, :inda, :logp, :ind1, :atoms), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, BagNode{ProductNode{NamedTuple{(:element, :bonds, :charge, :atom_type), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, BagNode{ProductNode{NamedTuple{(:element, :bond_type, :charge, :atom_type), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{Matrix{Float32}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}}}, Nothing}, AlignedBags{Int64}, Nothing}, ArrayNode{Matrix{Float32}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}}}, Nothing}, AlignedBags{Int64}, Nothing}}}, Nothing}}, Vector{Int64}}, shuffle=true, batchsize=10)
with first element:
(10-element Vector{ProductNode{NamedTuple{(:lumo, :inda, :logp, :ind1, :atoms), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, BagNode{ProductNode{NamedTuple{(:element, :bonds, :charge, :atom_type), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, BagNode{ProductNode{NamedTuple{(:element, :bond_type, :charge, :atom_type), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{Matrix{Float32}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}}}, Nothing}, AlignedBags{Int64}, Nothing}, ArrayNode{Matrix{Float32}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}}}, Nothing}, AlignedBags{Int64}, Nothing}}}, Nothing}}, 10-element Vector{Int64},)
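Each element yielded by the loader is a tuple: a `Vector` of `ProductNode`s and the matching labels. Before a forward pass, the individual samples can be merged into one batched node with `Mill.catobs`. A sketch of manually iterating over the loader (the actual training step below is handled by `Flux.Optimise.train!`):

```julia
using Mill  # provides catobs for merging Mill nodes

for (x, y) in data_loader
    batch = reduce(catobs, x)  # one batched ProductNode for the whole minibatch
    # forward pass: model(batch) yields 2×length(y) logits
end
```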
We can watch the training accuracy rising quite quickly over the epochs:
for i in 1:10
@info "Epoch $i"
Flux.Optimise.train!(loss, ps, data_loader, opt)
@show accuracy(ds_train, y_train)
end
[ Info: Epoch 1
accuracy(ds_train, y_train) = 0.39
[ Info: Epoch 2
accuracy(ds_train, y_train) = 0.39
[ Info: Epoch 3
accuracy(ds_train, y_train) = 0.34
[ Info: Epoch 4
accuracy(ds_train, y_train) = 0.59
[ Info: Epoch 5
accuracy(ds_train, y_train) = 0.61
[ Info: Epoch 6
accuracy(ds_train, y_train) = 0.61
[ Info: Epoch 7
accuracy(ds_train, y_train) = 0.61
[ Info: Epoch 8
accuracy(ds_train, y_train) = 0.61
[ Info: Epoch 9
accuracy(ds_train, y_train) = 0.61
[ Info: Epoch 10
accuracy(ds_train, y_train) = 0.61
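The loop above uses `loss`, `accuracy`, `ps`, and `opt`, which were defined earlier in the tutorial and are not shown in this excerpt. A minimal sketch of definitions consistent with the loop, assuming `model` is the Mill model built before and labels are 0/1, might be:

```julia
using Flux, Statistics

# Hedged sketch: `model` maps a (batch of) Mill nodes to 2×N logits.
# The 0/1 labels are shifted to 1-based class indices.
loss(x, y) = Flux.logitcrossentropy(model(x), Flux.onehotbatch(y .+ 1, 1:2))
accuracy(x, y) = mean(Flux.onecold(softmax(model(x))) .== y .+ 1)

ps = Flux.params(model)  # trainable parameters
opt = Flux.ADAM()        # optimizer passed to Flux.Optimise.train!
```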
Classify test set
The last part is inference and evaluation on the test data.
dataset_test = MLDatasets.Mutagenesis(split=:test);
x_test = dataset_test.features
y_test = dataset_test.targets
ds_test = extractor.(x_test)
44-element Vector{ProductNode{NamedTuple{(:lumo, :inda, :logp, :ind1, :atoms), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, BagNode{ProductNode{NamedTuple{(:element, :bonds, :charge, :atom_type), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, BagNode{ProductNode{NamedTuple{(:element, :bond_type, :charge, :atom_type), Tuple{ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}, ArrayNode{Matrix{Float32}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}}}, Nothing}, AlignedBags{Int64}, Nothing}, ArrayNode{Matrix{Float32}, Nothing}, ArrayNode{OneHotMatrix{UInt32, Vector{UInt32}}, Nothing}}}, Nothing}, AlignedBags{Int64}, Nothing}}}, Nothing}}:
ProductNode
ProductNode
ProductNode
⋮
ProductNode
ProductNode
Finally, we evaluate the accuracy on the test set.
@show accuracy(ds_test, y_test)
probs = softmax(model(ds_test))
o = Flux.onecold(probs)
mean(o .== y_test .+ 1)
0.6818181818181818
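Note the `.+ 1` shift in the accuracy computation: the dataset's targets are 0/1, while `Flux.onecold` returns 1-based class indices. A tiny self-contained illustration (with hypothetical values, not the actual predictions):

```julia
using Statistics

o = [2, 1, 2, 2]          # hypothetical 1-based predicted classes
y = [1, 0, 0, 1]          # hypothetical 0/1 ground-truth labels
acc = mean(o .== y .+ 1)  # shift labels to 1-based before comparing
# acc == 0.75
```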
The vector o holds the predicted classes for the test set; as computed above, the test-set accuracy is about 68%.
o
44-element Vector{Int64}:
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
Ground-truth classes for the test set
y_test .+ 1
44-element Vector{Int64}:
2
2
2
1
2
2
1
1
2
2
2
1
1
2
2
1
2
2
1
1
1
2
2
2
2
2
2
1
2
2
2
1
1
1
2
2
2
2
2
2
2
2
1
2
Probabilities for the test set
probs
2×44 Matrix{Float32}:
0.346156 0.348281 0.346863 0.353872 0.357072 0.348715 0.344788 0.336177 0.345915 0.345533 0.346622 0.348019 0.354409 0.346724 0.346259 0.336045 0.345113 0.347214 0.351089 0.336019 0.340851 0.342078 0.346539 0.325408 0.327077 0.323476 0.354072 0.342912 0.391918 0.398457 0.350724 0.351342 0.340165 0.353031 0.345547 0.352617 0.334764 0.345546 0.32557 0.348303 0.323376 0.346685 0.342451 0.336369
0.653844 0.651719 0.653137 0.646128 0.642928 0.651285 0.655212 0.663823 0.654085 0.654467 0.653378 0.651981 0.645591 0.653276 0.653741 0.663955 0.654887 0.652786 0.648911 0.663981 0.659149 0.657922 0.653461 0.674592 0.672923 0.676524 0.645929 0.657088 0.608082 0.601543 0.649276 0.648658 0.659835 0.646969 0.654453 0.647384 0.665236 0.654455 0.67443 0.651697 0.676625 0.653315 0.657549 0.663631
We can also look at individual samples. For instance, the second sample of the test set is
ds_test[2]
ProductNode # 1 obs, 176 bytes
├─── lumo: ArrayNode(99×1 OneHotArray with Bool elements) # 1 obs, 76 bytes
├─── inda: ArrayNode(2×1 OneHotArray with Bool elements) # 1 obs, 76 bytes
├─── logp: ArrayNode(63×1 OneHotArray with Bool elements) # 1 obs, 76 bytes
├─── ind1: ArrayNode(3×1 OneHotArray with Bool elements) # 1 obs, 76 bytes
╰── atoms: BagNode # 1 obs, 176 bytes
╰── ProductNode # 24 obs, 104 bytes
├──── element: ArrayNode(7×24 OneHotArray with Bool elements) # 24 obs, 168 bytes
├────── bonds: BagNode # 24 obs, 496 bytes
│ ╰── ProductNode # 50 obs, 56 bytes
│ ┊
├───── charge: ArrayNode(1×24 Array with Float32 elements) # 24 obs, 144 bytes
╰── atom_type: ArrayNode(29×24 OneHotArray with Bool elements) # 24 obs, 168 bytes
and its ground-truth class is
y_test[2] + 1
2
If you want to see the probability distribution, apply softmax to the output of the network.
softmax(model(ds_test[2]))
2×1 Matrix{Float32}:
0.34828082
0.65171915
So the model assigns this sample to the second class with a probability of about 65%.
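As a sanity check, `softmax` just exponentiates and normalizes the logits; in plain Julia, with hypothetical logit values for the two classes:

```julia
logits = [-0.3, 0.33]                  # hypothetical 2-class logits
p = exp.(logits) ./ sum(exp.(logits))  # softmax computed by hand
# p sums to 1 and p[2] > p[1], i.e. the second class is preferred
```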
This concludes a simple classifier for JSON data.
But keep in mind that the framework is general: thanks to its ability to embed hierarchical data into fixed-size vectors, it can be used for classification, regression, and various other ML tasks.