I had not meant to involve the whole list here. I must have pressed the
Reply All button instead of Reply.
But since it is in the open, I might as well explain myself. Thank you for
your suggestion, Goekhan.
My question to Chew Hong Gunn was why such very small values of C+ and C-
were used in the weight adjustment in the publication I mentioned. It is a
bit surprising, given that for the unweighted case we do not (at least I
think) consider small values such as C = 0.000001 and so on. I could suppose
this was because the ratio C+/C- is large enough to compensate, but that is
my own speculation, and I wanted some confirmation or denial.
Goekhan, thank you for your help. Due to my short timeframe, I do not have
time to implement these methods manually; instead I am using the brute force
of computing power. I shall study them in detail when I have time.
Regards, Adai.
----- Original Message -----
From: "Goekhan Bakir" <[log in to unmask]>
To: <[log in to unmask]>
Sent: Thursday, August 22, 2002 1:02 PM
Subject: Re: How to handle unbalanced data
> Why do you have to test the whole range from 0.01 to 1000?
> To find a good parameter C, where "good" for me means
> classification performance vs. number of support vectors, I use a kind of
> binary search. As soon as the result gets worse for a growing C, I try
> c_next = 0.5*(c_last+c_now), with c_last = c_now.
>
> For c > c_bad I expect an even worse result. But this is a heuristic.
> Comments?
>
> goekhan
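One possible reading of the heuristic above, sketched in Python: grow C geometrically while the score improves, then bisect between the last good C and the first bad one. The `evaluate` function is a hypothetical stand-in for a real cross-validation score, and the starting value and growth factor are assumptions; the thread does not specify them.

```python
# Sketch of the described heuristic: grow C until performance drops,
# then bisect with c_next = 0.5 * (c_last + c_now).
# `evaluate` is a toy single-peak score standing in for real CV.

def evaluate(c):
    # Hypothetical score curve peaking at C = 10; replace with a
    # cross-validation score in practice.
    return -(c - 10.0) ** 2

def search_c(c_start=0.01, grow=10.0, steps=20):
    c_last, s_last = c_start, evaluate(c_start)
    c_now = c_start * grow
    s_now = evaluate(c_now)
    # Phase 1: grow C geometrically while the score keeps improving.
    while s_now > s_last:
        c_last, s_last = c_now, s_now
        c_now *= grow
        s_now = evaluate(c_now)
    # Phase 2: the score got worse at c_now, so bisect between the
    # last good value and the first bad one.
    for _ in range(steps):
        c_next = 0.5 * (c_last + c_now)
        s_next = evaluate(c_next)
        if s_next > s_last:
            c_last, s_last = c_next, s_next
        else:
            c_now = c_next
    return c_last

print(search_c())
```

With the toy score above, the search settles on the peak near C = 10 instead of sweeping the full grid, which is the saving the heuristic aims for; as noted in the email, it assumes the score has a single peak in C.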
>
>
> On Thu, 22 Aug 2002, Adaikalavan Ramasamy wrote:
>
> > Dear Hong Gunn,
> >
> > Thank you for this. I think this is what I was looking for but never
> > thought to ask about. The first paper is relevant to my dissertation,
> > and very few people have talked about this weighting, instead preferring
> > to talk about adding to the diagonals of the kernel matrix to
> > increase/decrease sparseness.
> >
> > I have one question though, regarding the magnitude of the C+ and C-
> > reported in table 1 of "Target detection in radar imagery using SVM".
> > You yourself note that it is typical to use a value of 1 to 100, but
> > your values in the table range from 10^-6 to 1. You did not consider
> > any value of C+/C- greater than 1, say 1000. Why was this? Is this
> > particular to the dataset you were using, or is it the recommended
> > practice? I am thinking of testing values of 0.01, 0.1, 1, 10, 1000.
> > As you noted this is a lengthy process, and I would appreciate your
> > input.
> >
> > Thank you.
> >
> > Regards, Adai.
> >
> >
> > ----- Original Message -----
> > From: "Chew, Hong Gunn" <[log in to unmask]>
> > To: <[log in to unmask]>
> > Sent: Thursday, August 22, 2002 5:23 AM
> > Subject: Re: How to handle unbalanced data
> >
> >
> > > > In SVM classification,
> > > > if the training data are unbalanced,
> > > > e.g., lots of classA (10000) but few of classB (100),
> > > > how does one adjust the data to enhance the SVC model?
> > >
> > > What I do is to select a different error weighting for each class,
> > > eg, C+=1, C-=100, to even out the biasing caused by the dataset.
> > >
> > > You might want to have a look at
> > > H.G. Chew, R.E. Bogner, and C.C. Lim. Target detection in radar
> > > imagery using support vector machines with training size biasing.
> > > In International Conference on Control, Automation, Robotics and
> > > Vision, ICARCV 2000, Singapore, CD-ROM, 2000.
> > > http://www.kernel-machines.org/papers/upload_11483_ICARCV2000-4.ps
> > > and
> > > H.G. Chew, R.E. Bogner, and C.C. Lim. Dual nu-support vector machine
> > > with error rate and training size biasing. In International
> > > Conference on Acoustics, Speech and Signal Processing, ICASSP 2001,
> > > USA, 2001.
> > > http://www.kernel-machines.org/papers/upload_11513_ICASSP2001-nu-svm-1.ps
> > >
> > > Cheers,
> > > Hong-Gunn
> > >
> >
>