"Signing your rights away" is an informal legal phrase that refers to voluntarily giving up your legal rights or claims to something.
This can be done in several ways, such as by signing a contract, waiver, or release. Once you have signed your rights away, you generally can no longer enforce those rights against the other party.