communication between the agents (bidders) takes place in an untrustworthy environment, i.e. a public rather than a private environment. Therefore, we must ensure that the communication between the agents and the final decision on who holds the highest bid remain secure and private, even though they take place in this untrustworthy environment, see Figure 1.
Figure 1: Private Bidding Scenario.
Alice (A), Bob (B) and the Oblivious Third Party (OTP) are the three parties in the model. Alice and Bob are mobile software agents who share a random secret with each other. The OTP is a trusted party that correctly performs the computation on the inputs a and b supplied by Alice and Bob, without being able to learn any information about these inputs. The goal of Alice and Bob is to determine whether a > b. In an e-commerce setting this corresponds to checking whether the maximum price the buyer (Alice) is willing to pay exceeds the minimum price the seller (Bob) is willing to accept. Moreover, they reveal their actual bids a and b only if the condition a > b is satisfied; otherwise they keep their inputs private.
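As an illustration only, the following Python sketch (with hypothetical names, not the protocol messages of this paper) captures the intended functionality: the OTP acts as a black box that releases nothing beyond the single bit a > b, and the actual bids are disclosed only when that bit is true.

    def oblivious_compare(a: int, b: int) -> bool:
        # Stands in for the OTP: in the real protocol a and b arrive in
        # blinded/encrypted form and are never visible to the OTP in the clear.
        return a > b

    def private_bidding(buyer_max: int, seller_min: int):
        deal = oblivious_compare(buyer_max, seller_min)  # only this single bit is learned
        if deal:
            # the actual bids are revealed only once a > b holds
            return deal, (buyer_max, seller_min)
        return deal, None                                # otherwise both bids stay secret

    print(private_bidding(100, 80))  # (True, (100, 80))
    print(private_bidding(60, 80))   # (False, None)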
Our agent bidding model is situated in an untrustworthy environment. Because the general problem is hard, we assume that this environment executes the agents and their code correctly, but is curious about the data it processes. This implies that the agents cannot use their private keys or any other secret data. Therefore, we use the OTP to perform the comparison of the inputs without revealing them to the untrustworthy environment.
Such an untrustworthy environment is also known as an honest-but-curious adversary, a notion introduced by (Rabin, 1981). In our setting, the environment is a server that executes the mobile code, i.e. the mobile agent, while storing all information from the agents and all communication between them. The server is not a malicious environment that tries to modify the mobile code. Our goal is to ensure that the server does not learn the private inputs a and b before the agents decide to reveal them, e.g. when the bidding satisfies a > b.
2.2 Contribution
We present a private bidding protocol for mobile software agents that are situated in an honest-but-curious environment, using only an Oblivious Third Party. Our solution combines the private agent communication algorithm from (Cartrysse, 2005; Cartrysse and van der Lubbe, 2004) with the private bidding protocol from (Cachin, 1999).
With this contribution we demonstrate that a secure, efficient and fair private bidding protocol for mobile software agents is possible. Applications can be found in e-commerce scenarios where the bidding takes place on a public server and where the user has no communication with his mobile software agents.
2.3 Related Work
Private bidding originates from the Millionaires' problem, first described by (Yao, 1982). Further research proved that any multi-party function can be computed securely even without the help of a third party (Goldreich, 2002). However, according to (Algesheimer et al., 2001), secure mobile agent schemes do not exist when the environment is allowed to learn information that depends on the agent's current state. Therefore, in order to merge mobile agents with multi-party computation techniques, we need the help of an Oblivious Third Party.
The concept of combining mobile agents with decision making in the encrypted domain originates from (Sander and Tschudin, 1998a; Sander and Tschudin, 1998b). They proposed a software-only, non-interactive approach to computing with encrypted functions based on homomorphic encryption schemes. We elaborate on their concept by using (ElGamal, 1984) as the basis of our agent communication algorithm. We also show that the ElGamal encryption in our model can be used with the additive homomorphic property without losing security.
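As a rough illustration of the additive homomorphic property referred to above, the following Python sketch shows the exponential variant of ElGamal, in which componentwise multiplication of two ciphertexts adds the underlying plaintexts in the exponent; the toy parameters and function names are ours and do not reproduce this paper's instantiation.

    import random

    p = 467  # toy safe prime for illustration only (insecure parameter size)
    g = 2    # generator of the multiplicative group modulo p

    def keygen():
        x = random.randrange(1, p - 1)      # private key
        return x, pow(g, x, p)              # (sk, pk) with pk = g^x mod p

    def enc(pk, m):
        r = random.randrange(1, p - 1)
        # plaintext placed in the exponent: c = (g^r, g^m * pk^r)
        return pow(g, r, p), (pow(g, m, p) * pow(pk, r, p)) % p

    def add(c1, c2):
        # componentwise multiplication of ciphertexts adds the plaintexts
        return (c1[0] * c2[0]) % p, (c1[1] * c2[1]) % p

    def dec(sk, c, max_m=1000):
        gm = (c[1] * pow(c[0], p - 1 - sk, p)) % p  # recover g^m
        for m in range(max_m):                       # small discrete log by search
            if pow(g, m, p) == gm:
                return m
        raise ValueError("plaintext out of range")

    sk, pk = keygen()
    assert dec(sk, add(enc(pk, 12), enc(pk, 30))) == 42  # Enc(12) * Enc(30) decrypts to 42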
The Non-Interactive CryptoComputing for NC^1 by (Sander et al., 1999) also discusses computing with encrypted functions with respect to mobile agent systems. Their solution is capable of comparing two private inputs, but it does not provide fairness because the inputs must be encrypted under one of the agents' public keys. Consequently, the result is only available to one of the agents. Moreover, the private key needed to decrypt the result of the comparison cannot be used in the untrustworthy environment, and therefore the agent must leave this environment to learn the result of the computation.