
L2 Norm

dragoljub Member Posts: 241 Contributor II
edited November 2018 in Help
Is there any way to perform an L2 norm on feature vectors, i.e., normalize the vectors to length 1?

The Normalize operator seems to only scale within features, not across features.

Thanks,
-Gagi
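
For reference, a minimal sketch in Python/NumPy of the distinction being asked about (the array and values are illustrative only, not part of the original thread): row-wise L2 normalization scales each example vector to unit length, while per-attribute scaling, which is roughly what the Normalize operator does, rescales each column independently.

    import numpy as np

    # Toy example set: 3 examples (rows) x 4 attributes (columns).
    X = np.array([
        [3.0, 4.0, 0.0, 0.0],
        [1.0, 1.0, 1.0, 1.0],
        [0.0, 2.0, 0.0, 0.0],
    ])

    # Row-wise L2 normalization: each example is scaled to length 1.
    row_norms = np.linalg.norm(X, axis=1, keepdims=True)
    X_unit = X / row_norms                     # every row now has ||x||_2 == 1

    # Per-attribute (column-wise) scaling to [0, 1], independent per feature.
    col_min = X.min(axis=0)
    col_range = X.max(axis=0) - col_min
    X_scaled = (X - col_min) / np.where(col_range == 0.0, 1.0, col_range)

    print(np.linalg.norm(X_unit, axis=1))      # -> [1. 1. 1.]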

Answers

  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    unfortunately this isn't currently possible with a built-in operator. If you don't have too many attributes, you could do the calculation with the Generate Attributes operator.
    Why do you want to normalize the complete example this way? The attributes will receive different weights per example, and learning will be nearly impossible.


    Greetings,
      Sebastian
  • dragoljub Member Posts: 241 Contributor II
    Actually, depending on the learning task, the L2 norm can be quite effective. For example, say you have a long feature vector and want to check whether only a few out of many features are abnormally different compared to the population trend. When using dot-product kernels, the L2 norm lets you compute the angle between samples quite quickly, since the dot product of two normalized vectors equals cos(theta) (there is a short numeric sketch of this at the end of the thread). There is also the added benefit of working with small feature magnitudes, which speeds up computation in LibSVM, for example. So that's why I'm using it. There are other use cases as well, and I think it would be a great feature to add to the normalization operator.

    -Gagi
    Sebastian Land wrote:

    Hi,
    unfortunately this isn't currently possible with a built-in operator. If you don't have too many attributes, you could do the calculation with the Generate Attributes operator.
    Why do you want to normalize the complete example this way? The attributes will receive different weights per example, and learning will be nearly impossible.


    Greetings,
      Sebastian
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    I thought about this yesterday evening while driving home, and I came to exactly the same conclusion. I've added it to the growing stack of good ideas that are still to be implemented...

    Greetings,
      Sebastian
  • dragoljub Member Posts: 241 Contributor II
    Great!  :P
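
As a footnote to the cosine point above, a small numeric sketch in Python/NumPy (the vectors are made up purely for illustration): once two vectors are L2-normalized, their dot product is exactly cos(theta), the cosine of the angle between the original vectors.

    import numpy as np

    a = np.array([3.0, 4.0, 0.0])
    b = np.array([1.0, 2.0, 2.0])

    # L2-normalize both vectors to unit length.
    a_unit = a / np.linalg.norm(a)
    b_unit = b / np.linalg.norm(b)

    # Dot product of the unit vectors = cos(theta) between a and b.
    cos_theta = np.dot(a_unit, b_unit)

    # Same quantity computed directly, for comparison.
    cos_direct = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cos_theta, cos_direct)   # both ~0.733, i.e. theta ~ 42.8 degrees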