MCP Titan Cognitive Memory
An MCP server that enables neural memory sequence learning with a memory-augmented model for improved code understanding and generation, featuring state management, novelty detection, and model persistence.
A collaboration between @jasonkneen and @ExpressionsBot.
Follow us on X: @jasonkneen and @megaprompt.
An implementation inspired by Google Research's paper "Generative AI for Programming: A Common Task Framework". This server provides a neural memory system that can learn and predict sequences while maintaining state through a memory vector, following principles outlined in the research for improved code generation and understanding.

The implementation draws on concepts from that paper (Muennighoff et al., 2024), which introduces a framework for evaluating and improving code generation models. The Titan Memory Server implements several key ideas from it:
- Memory-augmented sequence learning and prediction
- State management through a persistent memory vector
- Novelty detection during training
- Model persistence (saving and loading)

These features align with the paper's goals of improving code understanding and generation through better memory and state management.
# Install dependencies
npm install
# Build the project
npm run build
# Run tests
npm test
init_model: Initialize the Titan Memory model with custom configuration.

{
  inputDim?: number;  // Input dimension (default: 64)
  outputDim?: number; // Output/Memory dimension (default: 64)
}
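For example, the model could be initialized with non-default dimensions; the values below are illustrative, not recommendations:

// Initialize with custom (illustrative) dimensions
await callTool('init_model', { inputDim: 32, outputDim: 128 });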
train_step: Perform a single training step with current and next state vectors.

{
  x_t: number[];    // Current state vector
  x_next: number[]; // Next state vector
}
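As a sketch, a single step on two adjacent vectors might look like this; the 4-dimensional values are purely illustrative and assume the model was initialized with inputDim: 4:

// Hypothetical single training step on two adjacent vectors
// (assumes init_model was called with inputDim: 4; values are illustrative)
await callTool('train_step', {
  x_t: [0.1, 0.2, 0.3, 0.4],    // current state
  x_next: [0.2, 0.3, 0.4, 0.5]  // next state the model should learn to predict
});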
forward_pass: Run a forward pass through the model with an input vector.

{
  x: number[]; // Input vector
}
save_model: Save the model to a specified path.

{
  path: string; // Path to save the model
}
load_model: Load the model from a specified path.

{
  path: string; // Path to load the model from
}
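A save/load round trip might look like the following sketch; the file path is only an example, not a required location:

// Persist the model, then restore it later (path is illustrative)
await callTool('save_model', { path: './titan-memory.json' });
await callTool('load_model', { path: './titan-memory.json' });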
get_status: Get current model status and configuration.

{} // No parameters required
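A call is simply made with an empty object; the exact shape of the returned status depends on the implementation:

// Query current model status and configuration (no parameters required)
const status = await callTool('get_status', {});
console.log(status);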
train_sequence: Train the model on a sequence of vectors.

{
  sequence: number[][]; // Array of vectors to train on
}
// Initialize model
await callTool('init_model', { inputDim: 64, outputDim: 64 });
// Train on a sequence
const sequence = [
  [1, 0, 0, /* ... */],
  [0, 1, 0, /* ... */],
  [0, 0, 1, /* ... */]
];
await callTool('train_sequence', { sequence });

// Run forward pass
const result = await callTool('forward_pass', {
  x: [1, 0, 0, /* ... */]
});
The project includes comprehensive tests covering:
- Model initialization and configuration
- Training and forward pass operations
- Memory state management
- Model persistence
- Edge cases and error handling
- Tensor cleanup and memory management
Run tests with:
npm test
The implementation uses tf.tidy() for proper tensor memory management.
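As a rough illustration of that pattern (not code from this repository), tf.tidy() disposes of every intermediate tensor created inside its callback and keeps only the tensor returned from it, which prevents repeated operations from leaking memory:

import * as tf from '@tensorflow/tfjs';

// Intermediate tensors created inside the callback are disposed when tidy() exits;
// only the tensor returned from the callback is kept alive for the caller.
function forwardSketch(weights: tf.Tensor, x: tf.Tensor): tf.Tensor {
  return tf.tidy(() => {
    const projected = tf.matMul(weights, x); // intermediate, freed automatically
    return tf.tanh(projected);               // returned, survives the tidy() scope
  });
}

// Usage sketch with illustrative shapes
const w = tf.randomNormal([64, 64]);
const input = tf.randomNormal([64, 1]);
const out = forwardSketch(w, input);
out.print();
tf.dispose([w, input, out]); // tensors created outside tidy() still need manual cleanup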
MIT License - feel free to use and modify as needed!