Proficient in Angular development with TypeScript, focusing on component architecture and dependency injection patterns.
- You are an Angular developer
- Use Angular CLI for project scaffolding
- Use TypeScript with strict mode enabled
- Use RxJS for state management and async operations
- Use the standard Angular naming conventions:
  - Components: .component.ts
  - Services: .service.ts
  - Pipes: .pipe.ts
  - Modules: .module.ts
  - Tests: .spec.ts
  - Directives: .directive.ts
You are an experienced data scientist who specializes in Python-based
data science and machine learning. You use the following tools:
- Python 3 as the primary programming language
- PyTorch for deep learning and neural networks
- NumPy for numerical computing and array operations
- Pandas for data manipulation and analysis
- Jupyter for interactive development and visualization
- Conda for environment and package management
- Matplotlib for data visualization and plotting
- You are a PyTorch ML engineer
- Use type hints consistently
- Favor readability over premature optimization
- Write modular code, using separate files for models, data loading, training, and evaluation
- Follow PEP8 style guide for Python code
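As a minimal sketch of these conventions (the file and function names are hypothetical, not part of the original guidelines), a data-loading module might look like this, with model, training, and evaluation code kept in separate files:

```python
"""data_loading.py -- hypothetical module illustrating the conventions above."""

from pathlib import Path

import pandas as pd


def load_raw_data(csv_path: Path) -> pd.DataFrame:
    """Read a raw CSV into a DataFrame.

    Model, training, and evaluation code would live in separate files
    (e.g. models.py, train.py, evaluate.py) per the guidelines above.
    """
    if not csv_path.exists():
        raise FileNotFoundError(f"No such file: {csv_path}")
    return pd.read_csv(csv_path)
```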
Please create a new Angular component following these guidelines:
- Include JSDoc comments for component and inputs/outputs
- Implement proper lifecycle hooks
- Include TypeScript interfaces for models
- Follow container/presentational component pattern where appropriate
- Include unit tests with Jasmine/Karma in a separate test file
- Make sure to create separate files for any services, pipes, modules, and directives
Please review the current code changes, looking for:
- Memory leaks (unsubscribed observables)
- Proper change detection strategy
- Proper use of async pipe
- Proper error handling
Format the review as:
```
## <FILENAME>
- <ISSUE>
...
- <ISSUE>
```
Generate a data processing pipeline with these requirements:
Input:
- Data loading from multiple sources (CSV, SQL, APIs)
- Input validation and schema checks
- Error logging for data quality issues
Processing:
- Standardized cleaning (missing values, outliers, types)
- Memory-efficient operations for large datasets
- Numerical transformations using NumPy
- Feature engineering and aggregations
Quality & Monitoring:
- Data quality checks at key stages
- Validation visualizations with Matplotlib
- Performance monitoring
Structure:
- Modular, documented code with error handling
- Configuration management
- Reproducible in Jupyter notebooks
- Example usage and tests
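A rough sketch of how such a pipeline might be laid out (the module, class, and column names are illustrative assumptions, not part of the spec; SQL and API loaders would follow the same pattern as the CSV loader):

```python
"""pipeline.py -- illustrative skeleton for the pipeline described above."""

import logging
from dataclasses import dataclass, field
from pathlib import Path

import numpy as np
import pandas as pd

logger = logging.getLogger(__name__)


@dataclass
class PipelineConfig:
    """Configuration management: source paths and expected schema in one place."""
    csv_path: Path
    required_columns: list[str] = field(default_factory=list)


def load_csv(config: PipelineConfig) -> pd.DataFrame:
    """Load one CSV source and run input validation / schema checks."""
    df = pd.read_csv(config.csv_path)
    missing = set(config.required_columns) - set(df.columns)
    if missing:
        # Data quality issue: log it rather than failing silently.
        logger.error("Missing required columns: %s", sorted(missing))
        raise ValueError(f"Schema check failed, missing columns: {sorted(missing)}")
    return df


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Standardized cleaning: types, missing values, simple outlier clipping."""
    df = df.copy()
    numeric_cols = df.select_dtypes(include=np.number).columns
    # Fill missing numeric values with the column median.
    df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].median())
    # Clip extreme outliers to the 1st/99th percentiles using NumPy.
    for col in numeric_cols:
        low, high = np.percentile(df[col], [1, 99])
        df[col] = df[col].clip(low, high)
    return df
```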
The user has provided the following information:
Create an exploratory data analysis workflow that includes:
Data Overview:
- Basic statistics (mean, median, std, quartiles)
- Missing values and data types
- Unique value distributions
Visualizations:
- Numerical: histograms, box plots
- Categorical: bar charts, frequency plots
- Relationships: correlation matrices
- Temporal patterns (if applicable)
Quality Assessment:
- Outlier detection
- Data inconsistencies
- Value range validation
Insights & Documentation:
- Key findings summary
- Data quality issues
- Variable relationships
- Next steps recommendations
- Reproducible Jupyter notebook
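One possible shape for this workflow, sketched as plain helper functions that a notebook could call in order (the DataFrame and column choices are placeholders):

```python
"""eda.py -- illustrative EDA helpers for the workflow described above."""

import matplotlib.pyplot as plt
import pandas as pd


def data_overview(df: pd.DataFrame) -> None:
    """Print basic statistics, dtypes, missing values, and unique counts."""
    print(df.describe())   # mean, std, quartiles for numeric columns
    print(df.dtypes)       # data types
    print(df.isna().sum()) # missing values per column
    print(df.nunique())    # unique value counts


def plot_distributions(df: pd.DataFrame) -> None:
    """Histograms for numeric columns and bar charts for categorical ones."""
    df.select_dtypes(include="number").hist(bins=30, figsize=(10, 6))
    plt.show()
    for col in df.select_dtypes(include=["object", "category"]).columns:
        df[col].value_counts().head(20).plot(kind="bar", title=col)
        plt.show()


def plot_correlations(df: pd.DataFrame) -> None:
    """Correlation matrix for numeric columns, rendered with Matplotlib."""
    corr = df.select_dtypes(include="number").corr()
    fig, ax = plt.subplots(figsize=(8, 6))
    im = ax.imshow(corr, cmap="coolwarm", vmin=-1, vmax=1)
    ax.set_xticks(range(len(corr.columns)))
    ax.set_xticklabels(corr.columns, rotation=90)
    ax.set_yticks(range(len(corr.columns)))
    ax.set_yticklabels(corr.columns)
    fig.colorbar(im, ax=ax)
    plt.show()
```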
The user has provided the following information:
Please create a training loop following these guidelines:
- Include validation step
- Add proper device handling (CPU/GPU)
- Implement gradient clipping
- Add learning rate scheduling
- Include early stopping
- Add progress bars using tqdm
- Implement checkpointing
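A hedged sketch of a loop covering these points; the model, data loaders, loss, and hyperparameters are assumed to exist and are purely illustrative:

```python
"""Illustrative training loop; model and loaders are assumed to be defined elsewhere."""

import torch
from torch import nn
from tqdm import tqdm


def train(model: nn.Module,
          train_loader: torch.utils.data.DataLoader,
          val_loader: torch.utils.data.DataLoader,
          epochs: int = 20,
          patience: int = 3,
          lr: float = 1e-3) -> None:
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")  # device handling
    model.to(device)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, patience=1)  # LR scheduling

    best_val_loss = float("inf")
    epochs_without_improvement = 0

    for epoch in range(epochs):
        model.train()
        for inputs, targets in tqdm(train_loader, desc=f"epoch {epoch}"):  # progress bar
            inputs, targets = inputs.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(inputs), targets)
            loss.backward()
            nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)  # gradient clipping
            optimizer.step()

        # Validation step
        model.eval()
        val_loss = 0.0
        with torch.no_grad():
            for inputs, targets in val_loader:
                inputs, targets = inputs.to(device), targets.to(device)
                val_loss += criterion(model(inputs), targets).item()
        val_loss /= max(len(val_loader), 1)
        scheduler.step(val_loss)

        # Checkpointing and early stopping
        if val_loss < best_val_loss:
            best_val_loss = val_loss
            epochs_without_improvement = 0
            torch.save(model.state_dict(), "checkpoint_best.pt")
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # early stopping
```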
Please create a new PyTorch module following these guidelines:
- Include docstrings for the model class and methods
- Add type hints for all parameters
- Add basic validation in __init__
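For example, a module meeting these guidelines might look like the following; the two-layer MLP itself is just a placeholder architecture:

```python
import torch
from torch import nn


class MLP(nn.Module):
    """A small two-layer perceptron used here only to illustrate the guidelines.

    Args:
        in_features: Size of each input sample.
        hidden_features: Width of the hidden layer.
        out_features: Size of each output sample.
    """

    def __init__(self, in_features: int, hidden_features: int, out_features: int) -> None:
        super().__init__()
        # Basic validation in __init__: fail fast on impossible shapes.
        if min(in_features, hidden_features, out_features) <= 0:
            raise ValueError("All layer sizes must be positive integers")
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden_features),
            nn.ReLU(),
            nn.Linear(hidden_features, out_features),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        """Apply the MLP to a batch of shape (batch, in_features)."""
        return self.net(x)
```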
Please convert this PyTorch module to equations. Use KaTeX, wrapping each equation in double dollar signs, like $$E_1 = E_2$$. Your output should explain, step by step, what happens at each stage, with a very short note on the purpose of each step.
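As an illustration of the expected output format, the placeholder MLP sketched above could be written as:

$$\mathbf{h} = W_1 \mathbf{x} + \mathbf{b}_1$$ (affine projection of the input to the hidden layer)

$$\mathbf{a} = \max(0, \mathbf{h})$$ (ReLU non-linearity introduces non-linear capacity)

$$\mathbf{y} = W_2 \mathbf{a} + \mathbf{b}_2$$ (affine projection to the output size)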
docker run -i --rm mcp/postgres ${{ secrets.palak-chhabra/palaks-angular-assistant/docker/mcp-postgres/POSTGRES_CONNECTION_STRING }}
docker run -i --rm -e SLACK_BOT_TOKEN -e SLACK_TEAM_ID mcp/slack
npx -y @executeautomation/playwright-mcp-server
npx -y @modelcontextprotocol/server-postgres ${{ secrets.palak-chhabra/palaks-angular-assistant/anthropic/postgres-mcp/CONNECTION_STRING }}
docker run --rm -i --mount type=bind,src=${{ secrets.palak-chhabra/palaks-angular-assistant/docker/mcp-git/GIT_DIR }},dst=${{ secrets.palak-chhabra/palaks-angular-assistant/docker/mcp-git/GIT_DIR }} mcp/git