# Linear Perceptron in 2D Space

In this post, we will look at how a linear perceptron classifies a two-dimensional space, using the familiar “Two Moons” dataset. We will observe how the classification changes as we apply various transformations to the two moons.

#### The Architecture of the Neural Net

1. Linear Layer
2. Logistic Sigmoid

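This two-layer net is exactly logistic regression: it computes σ(w·x + b), so its decision boundary in the plane is always the straight line w·x + b = 0. As a minimal sketch of the same model outside Mathematica, here is the equivalent in NumPy, trained by gradient descent on a toy separable set (variable names are my own, not from the code below):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy separable set: label 1 for the upper half-plane, 0 for the lower.
X = rng.uniform(-1, 1, size=(200, 2))
t = (X[:, 1] > 0).astype(float)

# "LinearLayer + LogisticSigmoid": p = sigmoid(X.w + b).
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w + b)        # forward pass
    g = p - t                     # gradient of cross-entropy w.r.t. the logit
    w -= lr * (X.T @ g) / len(X)  # linear-layer weight update
    b -= lr * g.mean()            # bias update

accuracy = ((sigmoid(X @ w + b) > 0.5) == (t == 1)).mean()
```

Because the model is linear, no amount of training can bend this boundary, which is why the two moons can only ever be split by a straight line in the images below.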
I have included the Logistic Sigmoid to smooth out the output.

#### Trained Nets

The following shows the training set alongside the trained net's view of the two-dimensional space. The configuration name is listed at the top of each image.

#### Code

I have used Wolfram Mathematica 12.0 for

1. Example set generation
2. Neural Net design
3. Plotting the images and exporting them as .png files

Feel free to change the configurations.

```
Module[
 {configurations = <||>, topKeys},
 configurations["NoGapNoOffset"] = <|
   "Upper" -> (0.6 <= Norm[#] <= 0.8 && #[[2]] > 0 &),
   "Lower" -> (0.6 <= Norm[#] <= 0.8 && #[[2]] < 0 &)|>;

 configurations["GapOffset"] = <|
   "Upper" -> (0.6 <= Norm[# + {0.1, 0}] <= 0.8 && #[[2]] > 0 &),
   "Lower" -> (0.6 <= Norm[# - {0.1, 0}] <= 0.8 && #[[2]] < 0 &)|>;

 configurations["HighGapNoOffset"] = <|
   "Upper" -> (0.6 <= Norm[# + {0, -0.1}] <= 0.8 && #[[2]] > 0.1 &),
   "Lower" -> (0.6 <= Norm[# - {0, -0.1}] <= 0.8 && #[[2]] < -0.1 &)|>;

 configurations["HighGapOffset"] = <|
   "Upper" -> (0.6 <= Norm[# + {0.1, -0.1}] <= 0.8 && #[[2]] > 0.1 &),
   "Lower" -> (0.6 <= Norm[# - {0.1, -0.1}] <= 0.8 && #[[2]] < -0.1 &)|>;

 configurations["NegativeGapNoOffset"] = <|
   "Upper" -> (0.6 <= Norm[# + {0, 0.1}] <= 0.8 && #[[2]] > -0.1 &),
   "Lower" -> (0.6 <= Norm[# - {0, 0.1}] <= 0.8 && #[[2]] < 0.1 &)|>;

 configurations["NegativeGapOffset"] = <|
   "Upper" -> (0.6 <= Norm[# + {-0.1, 0.1}] <= 0.8 && #[[2]] > -0.1 &),
   "Lower" -> (0.6 <= Norm[# - {-0.1, 0.1}] <= 0.8 && #[[2]] < 0.1 &)|>;

 configurations["NoGapRotation"] = <|
   "Upper" -> (0.6 <= Norm[#] <= 0.8 && (0.5 #[[1]] + #[[2]]) > 0 &),
   "Lower" -> (0.6 <= Norm[#] <= 0.8 && (0.5 #[[1]] + #[[2]]) < 0 &)|>;

 configurations["GapRotation"] = <|
   "Upper" -> (0.6 <= Norm[#] <= 0.8 && (0.5 #[[1]] + #[[2]]) > 0.1 &),
   "Lower" -> (0.6 <= Norm[#] <= 0.8 && (0.5 #[[1]] + #[[2]]) < -0.1 &)|>;

 topKeys = Keys[configurations];

 Riffle[
  MapThread[
   Module[{allData, upperMoon, lowerMoon, trainedNet,
      preTrainedGraphics, trainedGraphics, combinedGraphics},

     (* Rejection sampling: keep the uniform points satisfying each moon's predicate. *)
     allData = RandomReal[{-1, 1}, {15000, 2}];
     upperMoon = Select[allData, #1];
     lowerMoon = Select[allData, #2];

     (* Scatter plot of the raw training set. *)
     preTrainedGraphics =
      Graphics[{{PointSize@0.005, Opacity@0.5, Darker@Green,
         Point[upperMoon]},
        {PointSize@0.005, Opacity@0.5, Red, Point[lowerMoon]}},
       AspectRatio -> 1,
       PlotRange -> {{-1, 1}, {-1, 1}}, ImageSize -> 300,
       PlotLabel -> "TrainingSet"];

     (* The perceptron: one linear layer followed by a logistic sigmoid. *)
     trainedNet =
      NetTrain[NetChain[{LinearLayer[], LogisticSigmoid}],
       Join @@ {(# -> 1) & /@ upperMoon, (# -> 0) & /@ lowerMoon}];

     (* Color every grid point by the net's output to show its view of the plane. *)
     trainedGraphics =
      With[{blends = Join @@
          Table[{Blend[{Red, Green}, trainedNet[{x, y}]],
            PointSize@0.005, Point[{x, y}]},
           {x, -1, 1, 0.025},
           {y, -1, 1, 0.025}]},
       Graphics[
        {{PointSize@0.005, Opacity@0.5, Darker@Green, Point[upperMoon]},
         {PointSize@0.005, Opacity@0.5, Red, Point[lowerMoon]}, blends},
        ImageSize -> 300,
        AspectRatio -> 1,
        PlotRange -> {{-1, 1}, {-1, 1}},
        PlotLabel -> "TrainedNet's ViewOfSpace"]];

     combinedGraphics =
      Labeled[Framed[GraphicsRow[{preTrainedGraphics, trainedGraphics}]],
       #3, Top];
     Export[#3 <> ".png", combinedGraphics, ImageSize -> 700,
      ImageResolution -> 1000]] &,
   {configurations[#]["Upper"] & /@ topKeys,
    configurations[#]["Lower"] & /@ topKeys, topKeys}],
  "\n\n"]
 ] // Column

```
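For readers without Mathematica, the example-set generation above is ordinary rejection sampling: draw uniform points in [-1, 1]² and keep those that fall in the half-annulus 0.6 ≤ ‖x‖ ≤ 0.8 with the required sign on the y-coordinate. A NumPy sketch of the "NoGapNoOffset" configuration (the function name is my own):

```python
import numpy as np

def sample_two_moons(n=15000, seed=0):
    """Rejection-sample the "NoGapNoOffset" two-moons configuration."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(-1, 1, size=(n, 2))
    r = np.linalg.norm(pts, axis=1)          # Norm[#]
    in_ring = (0.6 <= r) & (r <= 0.8)        # 0.6 <= Norm[#] <= 0.8
    upper = pts[in_ring & (pts[:, 1] > 0)]   # upper moon: y > 0
    lower = pts[in_ring & (pts[:, 1] < 0)]   # lower moon: y < 0
    return upper, lower

upper, lower = sample_two_moons()
```

The other configurations are small variations on this: shifting the point before taking the norm translates a moon horizontally or vertically, and tightening the threshold on the y-coordinate carves a gap (or overlap) around the boundary.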

End of the post
