More information about Ace
Papers
Algorithms used in Ace are described in the following papers (papers and BibTeX entries can be downloaded by clicking on “Publications”).
- [1] Compiling Relational Bayesian Networks for Exact Inference. Mark Chavira, Adnan Darwiche, and Manfred Jaeger. International Journal of Approximate Reasoning (IJAR-2006).
- [2] Compiling Bayesian Networks with Local Structure. Mark Chavira and Adnan Darwiche. Proceedings of the 19th International Joint Conference on Artificial Intelligence (IJCAI-2005).
- [3] Exploiting Evidence in Probabilistic Inference. Mark Chavira, David Allen, and Adnan Darwiche. Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence (UAI-2005).
- [4] Solving MAP Exactly by Searching on Compiled Arithmetic Circuits. Jinbo Huang, Mark Chavira, and Adnan Darwiche. Proceedings of the 21st National Conference on Artificial Intelligence (AAAI-2006).
- [5] Encoding CNFs to Empower Component Analysis. Mark Chavira and Adnan Darwiche. Proceedings of the 9th International Conference on Theory and Applications of Satisfiability Testing (SAT-2006).
- [6] On Probabilistic Inference by Weighted Model Counting. Mark Chavira and Adnan Darwiche. Artificial Intelligence Journal (AIJ-2008).
- [7] Compiling Bayesian Networks Using Variable Elimination. Mark Chavira and Adnan Darwiche. Proceedings of the 20th International Joint Conference on Artificial Intelligence (IJCAI-2007).
Other references
The -sbk05 command-line option in Ace corresponds to a Bayesian network encoding proposed in the following paper and implemented in the model counter Cachet.
- Solving Bayesian Networks by Weighted Model Counting. Tian Sang, Paul Beame, and Henry Kautz. Proceedings of the 20th National Conference on Artificial Intelligence (AAAI-2005).
- The Cachet model counter.
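The encoding is presumably selected in the same way as the other compile options shown in the listings below, e.g., compile -sbk05 foo.net (an illustrative invocation; see readme.pdf in the distribution for the exact syntax).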
Benchmarks used in published results
Many benchmarks from published results are available from the download page and from the University of Washington. If you are interested in a specific benchmark that is not available from these two sources, send email to ace at cs dot ucla dot edu.
Options used in published results
Running Ace requires specifying certain options. These options have defaults and are described in readme.pdf, which is part of the distribution. We list below the options used for many of the experiments in the publications above. In each case, compilation was performed on a machine with 2GB of RAM. Note that both -dtHypergraph and -dtBnMinfill involve randomization and so may not produce the same results from one run to the next.
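Each configuration below begins by editing evaluate.bat to raise the memory limit. Since Ace runs on the Java virtual machine, this presumably amounts to adjusting the heap setting in the script’s java invocation, e.g., a line of the form java -Xmx1200m ... (a hypothetical form; the exact line in your copy of evaluate.bat may differ).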
All networks from reference [1]:
- Edit evaluate.bat to allocate 1200 megabytes of memory
- Compile with: compile -forceC2d -d02 -dtHypergraph 3 foo.net
- Evaluate with: evaluate foo.net foo.inst
Munin1-4 and diabetes from reference [2]:
- Edit evaluate.bat to allocate 1200 megabytes of memory
- Compile with: compile -forceC2d -cd05 -dtClauseMinfill foo.net
- Evaluate with: evaluate foo.net foo.inst
Other networks from reference [2]:
- Edit evaluate.bat to allocate 1200 megabytes of memory
- Compile with: compile -forceC2d -cd05 -dtBnMinfill foo.net
- Evaluate with: evaluate foo.net foo.inst
Networks from reference [7]:
- Edit evaluate.bat to allocate 1200 megabytes of memory
- Compile with: compile -forceC2d -cd06 -dtBnMinfill foo.net
- Evaluate with: evaluate foo.net foo.inst
Mapping between CNF variables and Bayesian Network variables
Specifying certain options to Ace allows one to retain the CNF encoding of the Bayesian network. The mapping between CNF literals and BN variables is included in the file with the “.lmap” suffix. We cover only the basics of this file here; for more detail, examine the source file OnlineEngine.java, which is included with Ace. Note that not every parameter has a corresponding literal in the CNF (e.g., when the algorithm determines that the parameter cannot influence the result), and that for some encodings, multiple parameters can share a single literal (e.g., when they have the same weight and belong to the same potential). For each CNF literal (the negative or positive version of a CNF variable), there is exactly one description line containing the mapping for that literal. A description line looks like one of the following:
- “cc” “I” literal weight elimOp srcVarName srcValName srcVal
- “cc” “P” literal weight elimOp srcPotName pos+
- “cc” “C” literal weight elimOp
You can ignore “cc”. I, P, and A indicate that the literal is an indicator, a parameter, or auxiliary, respectively (an auxiliary literal is, e.g., the negative version of an indicator or parameter literal, which is itself neither an indicator nor a parameter, or a literal of an intermediate variable introduced by the encoding). The literal and weight are straightforward; for more information, see the papers above. You can ignore elimOp (it is experimental and suggests the elimination operation: addition, maximization, or not applicable). srcVarName, srcValName, and srcVal give the BN variable’s name, the name of one of its values, and that value’s index (the N values of a BN variable are mapped to ints in [0, N)). srcPotName and pos are similar but apply to potentials: each potential is given a unique name, and its parameters are assigned ints starting at 0.
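To make the layout concrete, here is a minimal sketch (not Ace’s own reader) that scans an .lmap file and prints the mapping for each indicator literal. It assumes the fields are whitespace-separated in exactly the order shown above and that the cc token appears unquoted; the authoritative parsing logic, including the actual field delimiter, is in OnlineEngine.java.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    // Minimal sketch: print the BN mapping for each indicator ("I")
    // description line in an .lmap file. Field order follows the schematic
    // above; OnlineEngine.java (included with Ace) is authoritative.
    public class LmapSketch {
        public static void main(String[] args) throws IOException {
            try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] f = line.trim().split("\\s+");
                    // Description lines start with "cc" and carry at least
                    // the type, literal, weight, and elimOp fields.
                    if (f.length < 5 || !f[0].equals("cc")) continue;
                    String type = f[1];                       // "I", "P", or "A"
                    int literal = Integer.parseInt(f[2]);     // signed CNF literal
                    double weight = Double.parseDouble(f[3]); // literal's weight
                    // f[4] is elimOp, which this sketch ignores.
                    if (type.equals("I") && f.length >= 8) {
                        // srcVarName, srcValName, srcVal: the BN variable, one
                        // of its value names, and that value's index in [0, N).
                        System.out.printf("literal %d (weight %g): %s = %s (index %s)%n",
                                literal, weight, f[5], f[6], f[7]);
                    }
                }
            }
        }
    }

Compile and run with, e.g., javac LmapSketch.java followed by java LmapSketch foo.lmap.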
Send questions and comments to ace at cs.ucla.edu.