C# project on standard deviation
Project detail
I want a routine that determines whether a given value (which we provide) is likely to belong to a larger set of values (which we also provide). The way I think of it: compute the standard deviation of the big set of numbers, then find how many standard deviations the given value lies from the mean of the resulting distribution.
The calculation should be done with a .NET stats package, so that other measures of “belonging” can be used later.
Additional measures of “belonging” will be considered, but standard deviation should serve as the reference.
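To make the requirement concrete, here is a minimal sketch of the z-score check described above, using only plain C# and LINQ. The `ZScore` helper, the sample data, and the 3-standard-deviation cutoff are all illustrative assumptions, not part of the spec; in the actual project the calculation would be delegated to a stats package (Math.NET Numerics, for example, offers a `Statistics.StandardDeviation` extension method that could replace the hand-rolled computation).

```csharp
using System;
using System.Linq;

class ZScoreCheck
{
    // Returns how many standard deviations `value` lies from the mean of `sample`.
    // Uses the sample standard deviation (n - 1 denominator).
    static double ZScore(double[] sample, double value)
    {
        double mean = sample.Average();
        double sd = Math.Sqrt(sample.Sum(x => (x - mean) * (x - mean)) / (sample.Length - 1));
        return (value - mean) / sd;
    }

    static void Main()
    {
        double[] set = { 10, 12, 11, 13, 9, 11, 10, 12 };
        double z = ZScore(set, 18);

        // Assumed rule of thumb: treat anything within 3 standard deviations
        // of the mean as "likely to belong". The real cutoff is a project decision.
        Console.WriteLine($"z = {z:F2}, likely member: {Math.Abs(z) <= 3}");
    }
}
```

The routine's signature (set of doubles in, z-score out) is just one way to shape the API; the final interface would depend on how the larger project supplies its data.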
This is part of a much bigger project.