Overall Design of TensorWeaver

This chapter dives deep into the design philosophy, core framework components, and interaction mechanisms of TensorWeaver.

Core Design Principles

Before diving into specific modules, we first need to understand the design philosophy and core principles of TensorWeaver:

  • Education-First Approach
    • Prioritizes clarity and understanding over performance
    • Every component is designed to be easily understood and debugged
    • Comprehensive documentation explaining implementation details and design choices
  • Simplicity and Readability
    • Written in pure Python with minimal dependencies (primarily NumPy)
    • Clean, well-structured code that's easy to follow
    • Avoids unnecessary complexity while maintaining essential functionality

With these principles in mind, we designed TensorWeaver as a simple, easy-to-understand, and educational framework.

Hierarchical Abstraction

TensorWeaver adopts a hierarchical abstraction design, with each layer having a clear responsibility:

Abstraction         Components
------------------  -------------------------------------------------
Model Export        ONNX
Model Training      loss function, optimizer, learning rate scheduler
Model Builder       Module, Parameter
Autodiff Engine     Variable, Function
Computation Engine  numpy

The advantages of this hierarchical design:

  • Separation of Concerns: each layer only needs to focus on its own responsibilities
  • Flexibility: advanced users can directly use the low-level API to implement custom functionality
  • Maintainability: modifying the implementation of one layer does not affect the interfaces of other layers
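To make the layering concrete, here is a minimal sketch of how the Model Builder layer (Module, Parameter) can sit on top of the Autodiff layer (Variable), which in turn wraps the NumPy computation layer. The class bodies below are illustrative assumptions, not TensorWeaver's actual implementation:

```python
import numpy as np

class Variable:
    """Autodiff layer: wraps a NumPy array (the computation layer)."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)

class Parameter(Variable):
    """Model Builder layer: a Variable that a Module treats as trainable."""
    pass

class Module:
    """Model Builder layer: collects Parameters for the training layer."""
    def parameters(self):
        # Find every Parameter attached to this module as an attribute.
        return [v for v in vars(self).values() if isinstance(v, Parameter)]

class Linear(Module):
    def __init__(self, in_features, out_features):
        self.weight = Parameter(np.zeros((in_features, out_features)))
        self.bias = Parameter(np.zeros(out_features))

    def __call__(self, x):
        # The forward pass delegates the actual math to NumPy.
        return Variable(x.data @ self.weight.data + self.bias.data)

layer = Linear(3, 2)
out = layer(Variable(np.ones(3)))
print(len(layer.parameters()))  # 2 trainable Parameters (weight, bias)
print(out.data.shape)           # (2,)
```

Note how each class only touches the layer directly below it: a training loop would only need `Module.parameters()`, never the raw NumPy arrays, which is exactly the separation of concerns described above.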

Graph Building Approach

Generally speaking, there are two approaches to building a computation graph. A dynamic graph is built on the fly: the graph is constructed as the computation runs (this is sometimes called define-by-run). A static graph, in contrast, is defined in advance, before any real computation takes place.

PyTorch is the classic framework adopting the dynamic graph approach, while TensorFlow is the classic framework adopting the static graph approach. In recent years, however, the boundary between dynamic and static graphs has become increasingly blurred.

Although static graphs are more efficient and easier to optimize, dynamic graphs are more popular in the deep learning community because they are more user-friendly and easier to debug.

TensorWeaver adopts the dynamic graph approach.
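The define-by-run idea can be shown in a few lines: each operation records itself as the creator of its output while the forward computation runs, so the graph exists as a side effect of ordinary Python execution and can be walked backwards afterwards. This is an illustrative sketch of the technique, not TensorWeaver's actual Variable/Function code:

```python
import numpy as np

class Variable:
    def __init__(self, data, creator=None):
        self.data = np.asarray(data, dtype=float)
        self.creator = creator  # the Function that produced this Variable
        self.grad = None

    def backward(self):
        # Walk the graph that was recorded during the forward pass,
        # from this output back to the input (single-chain case).
        self.grad = np.ones_like(self.data)
        fn = self.creator
        while fn is not None:
            fn.input.grad = fn.backward(fn.output.grad)
            fn = fn.input.creator

class Square:
    def __call__(self, x):
        # Building the graph on the fly: remember input and output.
        self.input = x
        self.output = Variable(x.data ** 2, creator=self)
        return self.output

    def backward(self, grad_out):
        # d(x^2)/dx = 2x, times the incoming gradient (chain rule).
        return 2 * self.input.data * grad_out

x = Variable(3.0)
y = Square()(Square()(x))  # the graph is built while computing x**4
y.backward()
print(float(y.data))  # 81.0
print(float(x.grad))  # dy/dx = 4 * x**3 = 108.0
```

Because the graph is just the chain of `creator` links recorded at run time, ordinary Python control flow (loops, conditionals, print-debugging) works unchanged inside the model, which is precisely why the dynamic approach is easier to debug.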