Julie wrote:
> Are there any Bayesian network algorithms that work directly on a
> mixture of continuous and categorical variables?
Yes, in Hugin you can mix a special kind of continuous variable, called a
"conditional Gaussian" (CG) variable, with discrete (categorical)
variables.
A CG variable has a normal distribution for each configuration of its
discrete parents, with a mean that depends linearly on its continuous
(CG) parents.
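To make this concrete, here is a rough sketch in Python (this is not
Hugin code; the parent names and parameter values are made up purely for
illustration) of the conditional distribution a CG variable follows:

    import random

    # For a CG variable X with one discrete parent D and one continuous (CG)
    # parent Z, each state of D selects its own intercept, linear weight, and
    # variance:
    #
    #     X | D=i, Z=z  ~  Normal(alpha[i] + beta[i] * z, sigma2[i])

    # Hypothetical parameters, one set per state of the discrete parent D.
    alpha  = {"low": 0.0, "high": 2.5}   # intercepts
    beta   = {"low": 1.0, "high": 0.3}   # linear weights on the CG parent Z
    sigma2 = {"low": 0.5, "high": 1.2}   # conditional variances

    def sample_cg(d_state, z):
        """Draw X given the discrete parent's state and the CG parent's value."""
        mean = alpha[d_state] + beta[d_state] * z
        return random.gauss(mean, sigma2[d_state] ** 0.5)

    print(sample_cg("high", z=1.7))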
If you want any other kind of continuous variable, you need to
discretize it (and lose accuracy). I don't know exactly how much you
lose by discretizing, but if you use many narrow intervals you can of
course keep the loss to a minimum.
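For example, a simple equal-width discretization looks like this (again
just a sketch; the variable names, value range, and bin count below are
arbitrary assumptions, not a recommendation):

    import numpy as np

    # Hypothetical continuous measurements to be discretized.
    values = np.array([0.3, 1.7, 2.2, 4.9, 3.1])

    # Cut the range [0, 5) into equal-width intervals; using more, narrower
    # intervals loses less information, at the cost of larger discrete tables.
    n_bins = 10
    edges = np.linspace(0.0, 5.0, n_bins + 1)

    # np.digitize maps each value to the index of the interval it falls in.
    states = np.digitize(values, edges[1:-1])
    print(states)  # [0 3 4 9 6] -- the discrete states used for the BN node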
You should be able to download a free demo of Hugin from our web site at
www.hugin.com.
Best regards,
Lars
> The classification problem that I am working on has 37 input variables; 15
> of them are categorical and the rest are continuous. To my
> understanding, I need to discretize the continuous variables in order to
> apply some commonly used algorithms (such as junction tree) to construct
> and estimate BNs. Since a large portion of the input variables are
> continuous, I am afraid of losing information by discretizing them.
> References and input on working directly on the mixture will
> be highly appreciated. I would also like to have any comments and
> experiences on how much gain we can get from working on the mixture
> directly over transforming all variables into discrete. Thanks.
>
> Best regards,
>
> Julie
>
--
Lars M. Nielsen, M.Sc. (CS & Math.)
Hugin Expert A/S
Niels Jernes Vej 14
DK-9220 Aalborg East
Email: ln@hugin.com
Phone: +45 96354548
Fax: +45 96354544