k1lib.selector module
This module is mainly used internally, although end users can enjoy some of its
benefits too. The idea is to create a tree structure that mirrors the given
torch.nn.Module exactly. With that tree structure, we can then select
specific parts of the module for whatever purpose we'd like, hence the main
class's name: ModuleSelector.
Let’s say you have a Network architecture like this:
import torch.nn as nn

class DynamicGate(nn.Module):
   def __init__(self, hiddenDim):
      super().__init__()
      self.lin = nn.Linear(hiddenDim, 1)
      self.sigmoid = nn.Sigmoid()
   def forward(self, x1): return self.sigmoid(self.lin(x1))
class SkipBlock(nn.Module):
   def __init__(self, hiddenDim=10, gate:type=None):
      super().__init__()
      def gen(): return nn.Linear(hiddenDim, hiddenDim), nn.LeakyReLU()
      self.seq = nn.Sequential(*gen(), *gen(), *gen())
      self.gate = gate(hiddenDim) if gate is not None else None
   def forward(self, x):
      if self.gate is None: return self.seq(x) + x
      else:
            r = self.gate(x)
            return r*x + (1-r)*self.seq(x)
class Network(nn.Module):
   def __init__(self, hiddenDim=10, blocks=1, block:type=SkipBlock, gate:type=DynamicGate):
      super().__init__()
      layers = []
      layers += [nn.Linear(1, hiddenDim), nn.LeakyReLU()]
      for i in range(blocks): layers += [block(hiddenDim, gate)]
      layers += [nn.Linear(hiddenDim, 1)]
      self.bulk = nn.Sequential(*layers)
   def forward(self, x):
      return self.bulk(x)
Creating a new network:
n = Network(); print(n)
Output:
Network(
   (bulk): Sequential(
      (0): Linear(in_features=1, out_features=10, bias=True)
      (1): LeakyReLU(negative_slope=0.01)
      (2): SkipBlock(
         (seq): Sequential(
            (0): Linear(in_features=10, out_features=10, bias=True)
            (1): LeakyReLU(negative_slope=0.01)
            (2): Linear(in_features=10, out_features=10, bias=True)
            (3): LeakyReLU(negative_slope=0.01)
            (4): Linear(in_features=10, out_features=10, bias=True)
            (5): LeakyReLU(negative_slope=0.01)
         )
         (gate): DynamicGate(
            (lin): Linear(in_features=10, out_features=1, bias=True)
            (sigmoid): Sigmoid()
         )
      )
      (3): Linear(in_features=10, out_features=1, bias=True)
   )
)
Creating a simple selector:
import k1lib.selector

selector = k1lib.selector.select(n, """
SkipBlock > #seq: propA, propB
SkipBlock LeakyReLU, #gate > #lin: propC
#bulk > #0
"""); print(selector)
Output:
ModuleSelector:
root: Network
   bulk: Sequential
      0: Linear                  all
      1: LeakyReLU
      2: SkipBlock
         seq: Sequential         propA, propB
            0: Linear
            1: LeakyReLU         propC
            2: Linear
            3: LeakyReLU         propC
            4: Linear
            5: LeakyReLU         propC
         gate: DynamicGate
            lin: Linear          propC
            sigmoid: Sigmoid
      3: Linear
So essentially, this is kinda similar to CSS selectors. "#a" selects any module with the name "a". "b" selects any module whose class name is "b". Combinators like "a b" (indirect child) and "a > b" (direct child) work the same as in CSS too.
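To illustrate, here's a quick sketch with a couple more selector strings against the Network above (prop1 and prop2 are just made-up property names for this example):
# "#bulk > Linear" matches Linear modules directly under the module named "bulk",
# while "SkipBlock #lin" matches any module named "lin" anywhere inside a SkipBlock
selector2 = k1lib.selector.select(n, """
#bulk > Linear: prop1
SkipBlock #lin: prop2
"""); print(selector2)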
Note
You can also use the asterisk "*" to select everything. So, "#a > *" will match
all direct children of the module named "a", and "#a *" will select everything
recursively under it. In fact, when you first create k1lib.Learner,
the css is "*", which selects everything by default.
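A minimal sketch of that default behavior, reusing the network n from above:
everything = k1lib.selector.select(n, "*")
print(everything) # every node in the printed tree should now carry the "all" property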
For each selection statement, you can attach specific properties to it. If no properties are specified, then the property "all" will be used. You can then get a list of the selected modules:
for m in selector.modules("propA"):
   print(type(m.nnModule))
Output:
<class 'torch.nn.modules.linear.Linear'>
<class 'torch.nn.modules.container.Sequential'>
Here, it selects any module that has the property "propA" or "all".
There are other methods that are analogous to torch.nn.Module's, like
named_children() and whatnot.
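For instance, a small sketch that walks the selector from earlier using those methods:
# named_children() yields (name, ModuleSelector) pairs for the direct children,
# while named_modules() walks the whole tree, optionally filtered by a property
for name, child in selector.named_children():
   print(name, child.selected("propA"))
for name, _ in selector.named_modules("propC"):
   print(name)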
class k1lib.selector.ModuleSelector(parent: k1lib.selector.ModuleSelector, name: str, nnModule: torch.nn.modules.module.Module)
   Bases: object

   property displayF
      Function to display each ModuleSelector's lines. Default is just:

      lambda mS: ", ".join(mS.selectedProps)

   selected(prop: Optional[str] = None) → bool
      Whether this ModuleSelector has a specific prop

   named_children() → Iterator[Tuple[str, k1lib.selector.ModuleSelector]]
      Get all named direct children

   named_modules(prop: Optional[str] = None) → Iterator[Tuple[str, torch.nn.modules.module.Module]]
      Get all named children recursively

      Parameters
         prop – Filter property

   children() → Iterator[k1lib.selector.ModuleSelector]
      Get all direct children

   modules(prop: Optional[str] = None) → Iterator[torch.nn.modules.module.Module]
      Get all children recursively. Optional filter prop

   property directParams
      Params directly under this module

   parameters() → Iterator[torch.nn.parameter.Parameter]
      Get a generator of parameters, at all depths

   property deepestDepth
      Deepest depth of the tree. If self doesn't have any children, then the depth is 0

   apply(f: Callable[[k1lib.selector.ModuleSelector], None])
      Applies a function to self and all child ModuleSelectors

   copy()
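As a rough sketch of how a few of these members fit together (reusing the Network n from before; propA is just a made-up property name):
sel = k1lib.selector.select(n, "SkipBlock > #seq: propA")
print(sel.deepestDepth) # how many levels there are below the root
# apply() visits self and every ModuleSelector underneath it
matches = []
sel.apply(lambda mS: matches.append(mS) if mS.selected("propA") else None)
print(len(matches)) # number of nodes that carry "propA"
# parameters() yields torch Parameters at all depths under this node
print(sum(p.numel() for p in sel.parameters()))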
k1lib.selector.filter(selectors: str, defaultProp='all') → List[str]
   Removes all the quirky features allowed by the css language, and outputs nice lines.

   Parameters
      selectors – single css selector string. Statements are separated by "\n" or ";"
      defaultProp – default property, in case a statement doesn't have one
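A quick sketch of calling it directly (the exact cleaned-up strings it returns are an internal detail):
# statements can be separated by ";" or newlines; statements without a property get defaultProp
lines = k1lib.selector.filter("SkipBlock > #seq: propA, propB; #bulk > #0", defaultProp="all")
print(lines) # a plain List[str] of cleaned-up statements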
k1lib.selector.select(model: torch.nn.modules.module.Module, selectors: str) → k1lib.selector.ModuleSelector
   Creates a new ModuleSelector, in sync with a model