ate(x))
case BinaryOp("+", x1, x2) => (evaluate(x1) + evaluate(x2))
case BinaryOp("-", x1, x2) => (evaluate(x1) - evaluate(x2))
case BinaryOp("*", x1, x2) => (evaluate(x1) * evaluate(x2))
case BinaryOp("/", x1, x2) => (evaluate(x1) / evaluate(x2))
}
}
}
}
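For reference, the evaluator fragment above can be filled out into a self-contained sketch. The `Expr` case classes here are assumed to match the ones defined in Part 1 of this series; treat the exact signatures as an assumption:

```scala
// A self-contained sketch of the evaluator shown above. The Expr case
// classes are assumed to match those defined in Part 1 of the series.
sealed abstract class Expr
case class Number(value: Double) extends Expr
case class UnaryOp(operator: String, arg: Expr) extends Expr
case class BinaryOp(operator: String, left: Expr, right: Expr) extends Expr

object EvaluatorDemo {
  def evaluate(e: Expr): Double = e match {
    case Number(x) => x
    case UnaryOp("-", x) => -evaluate(x)
    case BinaryOp("+", x1, x2) => evaluate(x1) + evaluate(x2)
    case BinaryOp("-", x1, x2) => evaluate(x1) - evaluate(x2)
    case BinaryOp("*", x1, x2) => evaluate(x1) * evaluate(x2)
    case BinaryOp("/", x1, x2) => evaluate(x1) / evaluate(x2)
  }

  def main(args: Array[String]): Unit = {
    // 1 + 2 * 3, built by hand as an AST
    val ast = BinaryOp("+", Number(1), BinaryOp("*", Number(2), Number(3)))
    println(evaluate(ast)) // prints 7.0
  }
}
```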
The Busy Java Developer's Guide to Scala: Building a Calculator, Part 2
2011-01-30, by Ted Neward
Readers of the previous article may recall that I left a challenge: improve the optimization step so that it simplifies further down into the tree, rather than stopping at the topmost level as the code in Listing 1 does. Lex Spoon came up with what I consider the simplest approach: first simplify the "edges" of the tree (the operands of each expression, if any), then use those simplified results to simplify the top-level expression, as shown in Listing 2:
Listing 2. Simplify, then simplify some more
/*
 * Lex's version:
*/
def simplify(e: Expr): Expr = {
// first simplify the subexpressions
val simpSubs = e match {
// Ask each side to simplify
case BinaryOp(op, left, right) => BinaryOp(op, simplify(left), simplify(right))
// Ask the operand to simplify
case UnaryOp(op, operand) => UnaryOp(op, simplify(operand))
// Anything else doesn't have complexity (no operands to simplify)
case _ => e
}
// now simplify at the top, assuming the components are already simplified
def simplifyTop(x: Expr) = x match {
// Double negation returns the original value
case UnaryOp("-", UnaryOp("-", x)) => x
// Positive returns the original value
case UnaryOp("+", x) => x
// Multiplying x by 1 returns the original value
case BinaryOp("*", x, Number(1)) => x
// Multiplying 1 by x returns the original value
case BinaryOp("*", Number(1), x) => x
// Multiplying x by 0 returns zero
case BinaryOp("*", x, Number(0)) => Number(0)
// Multiplying 0 by x returns zero
case BinaryOp("*", Number(0), x) => Number(0)
// Dividing x by 1 returns the original value
case BinaryOp("/", x, Number(1)) => x
// Dividing x by x returns 1
case BinaryOp("/", x1, x2) if x1 == x2 => Number(1)
// Adding x to 0 returns the original value
case BinaryOp("+", x, Number(0)) => x
// Adding 0 to x returns the original value
case BinaryOp("+", Number(0), x) => x
// Anything else cannot (yet) be simplified
case e => e
}
simplifyTop(simpSubs)
}
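To see why simplifying the operands first matters, here is a compact, self-contained version of the same idea (the `Expr` case classes are assumed from Part 1). The nested expression in the demo collapses all the way down only because its operands are simplified before the top-level rules run:

```scala
// Compact demo of Lex's bottom-up simplification. The Expr case
// classes are assumed to match those defined in Part 1 of the series.
sealed abstract class Expr
case class Number(value: Double) extends Expr
case class UnaryOp(operator: String, arg: Expr) extends Expr
case class BinaryOp(operator: String, left: Expr, right: Expr) extends Expr

object SimplifyDemo {
  def simplify(e: Expr): Expr = {
    // Simplify the subexpressions first...
    val simpSubs = e match {
      case BinaryOp(op, left, right) => BinaryOp(op, simplify(left), simplify(right))
      case UnaryOp(op, operand) => UnaryOp(op, simplify(operand))
      case _ => e
    }
    // ...then apply the top-level rewrite rules.
    simpSubs match {
      case UnaryOp("-", UnaryOp("-", x)) => x
      case UnaryOp("+", x) => x
      case BinaryOp("*", x, Number(1)) => x
      case BinaryOp("*", Number(1), x) => x
      case BinaryOp("*", _, Number(0)) => Number(0)
      case BinaryOp("*", Number(0), _) => Number(0)
      case BinaryOp("/", x, Number(1)) => x
      case BinaryOp("/", x1, x2) if x1 == x2 => Number(1)
      case BinaryOp("+", x, Number(0)) => x
      case BinaryOp("+", Number(0), x) => x
      case other => other
    }
  }

  def main(args: Array[String]): Unit = {
    // ((2 + 0) * 1) simplifies to 2: the inner "+ 0" must be rewritten
    // first, or the "* 1" rule would leave BinaryOp("+", 2, 0) behind.
    val ast = BinaryOp("*", BinaryOp("+", Number(2), Number(0)), Number(1))
    println(simplify(ast)) // prints Number(2.0)
  }
}
```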
Thanks to Lex for this.
Parsing
Now for the other half of building the DSL: we need to write some code that takes textual input and transforms it into an AST. This process is more formally known as parsing (or, more precisely, tokenizing, lexing, and parsing).
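Before getting into the parser proper, the tokenizing step can be illustrated with a minimal hand-rolled sketch. This is illustrative only, not the implementation the article builds; the regex and the `tokenize` helper are my own:

```scala
// A minimal tokenizer sketch: split an arithmetic expression into
// number and operator tokens. Illustrative only -- a real lexer would
// report unrecognized characters instead of silently skipping them.
object TokenizerDemo {
  def tokenize(input: String): List[String] =
    """\d+(?:\.\d+)?|[-+*/()]""".r.findAllIn(input).toList

  def main(args: Array[String]): Unit = {
    println(tokenize("3 + 4 * (2 - 1)"))
    // prints List(3, +, 4, *, (, 2, -, 1, ))
  }
}
```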