Mirror of https://github.com/apache/superset.git
Synced 2026-05-02 06:24:37 +00:00
| SHA1 |
|---|
| 2511f9e8d2 |
| 01453a9293 |
| 239625656f |
| 4820175786 |
| dee44e7de6 |
| a492f27cbc |
| 668f60ba7e |
| 489d341688 |
| 6e8d360525 |
| d237cd63c0 |
| 233135e8ec |
| 1d2bf882f9 |
| 8833a6e37e |
| 2cab564bb7 |
| f0e7b8b7a1 |
| e981a18afd |
| f2f22638de |
| 75a329abb3 |
.asf.yaml | 53
@@ -17,14 +17,7 @@
# https://cwiki.apache.org/confluence/display/INFRA/.asf.yaml+features+for+git+repositories
---
notifications:
  commits: commits@superset.apache.org
  issues: notifications@superset.apache.org
  pullrequests: notifications@superset.apache.org
  discussions: notifications@superset.apache.org

github:
  del_branch_on_merge: true
  description: "Apache Superset is a Data Visualization and Data Exploration Platform"
  homepage: https://superset.apache.org/
  labels:
@@ -54,54 +47,8 @@ github:
    projects: true
    # Enable wiki for documentation
    wiki: true
    # Enable discussions
    discussions: true

  enabled_merge_buttons:
    squash: true
    merge: false
    rebase: false

  ghp_branch: gh-pages
  ghp_path: /

  protected_branches:
    master:
      required_status_checks:
        # strict means "Require branches to be up to date before merging".
        strict: false
        # contexts are the names of checks that must pass
        # unfortunately AFAICT for `matrix:` jobs, we have to itemize every
        # combination here.
        contexts:
          - lint-check
          - cypress-matrix (0, chrome)
          - cypress-matrix (1, chrome)
          - cypress-matrix (2, chrome)
          - cypress-matrix (3, chrome)
          - cypress-matrix (4, chrome)
          - cypress-matrix (5, chrome)
          - dependency-review
          - frontend-build
          - pre-commit (current)
          - pre-commit (previous)
          - test-mysql
          - test-postgres (current)
          - test-postgres-hive
          - test-postgres-presto
          - test-sqlite
          - unit-tests (current)

      required_pull_request_reviews:
        dismiss_stale_reviews: false
        require_code_owner_reviews: true
        required_approving_review_count: 1

      required_signatures: false
    gh-pages:
      required_pull_request_reviews:
        dismiss_stale_reviews: false
        require_code_owner_reviews: true
        required_approving_review_count: 1

      required_signatures: false
@@ -1,10 +0,0 @@
# JavaScript to TypeScript Migration Command

## Usage
```
/js-to-ts <core-filename>
```
- `<core-filename>` - Path to CORE file relative to `superset-frontend/` (e.g., `src/utils/common.js`, `src/middleware/loggerMiddleware.js`)

## Agent Instructions

**See:** [../projects/js-to-ts/AGENT.md](../projects/js-to-ts/AGENT.md) for the complete migration guide.
@@ -1,684 +0,0 @@
# JavaScript to TypeScript Migration Agent Guide

**Complete technical reference for converting JavaScript/JSX files to TypeScript/TSX in the Apache Superset frontend.**

**Agent Role:** Atomic migration unit - migrate the core file + ALL related tests/mocks as one cohesive unit. Use `git mv` to preserve history, NO `git commit`. NO global import changes. Report results upon completion.

---

## 🎯 Migration Principles

1. **Atomic migration units** - Core file + all related tests/mocks migrate together
2. **Zero `any` types** - Use proper TypeScript throughout
3. **Leverage existing types** - Reuse established definitions
4. **Type inheritance** - Derivatives extend base component types
5. **Strategic placement** - File types for maximum discoverability
6. **Surgical improvements** - Enhance existing types during migration

---

## Step 0: Dependency Check (MANDATORY)

**Command:**
```bash
grep -E "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" superset-frontend/(unknown)
```

**Decision:**
- ✅ No matches → Proceed with atomic migration (core + tests + mocks)
- ❌ Matches found → EXIT with a dependency report (see format below)
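
Wrapped as a reusable check, the command above might look like this sketch. The helper name and the throwaway demo files are illustrative only, not part of the repo:

```bash
# Hypothetical helper wrapping the Step 0 dependency check.
has_js_deps() {
  # Succeeds (exit 0) if the file imports any relative or src/ .js/.jsx module.
  grep -Eq "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" "$1"
}

# Self-contained demo on throwaway files:
tmp=$(mktemp -d)
printf "import { x } from './helper.js';\n" > "$tmp/blocked.js"
printf "import { useState } from 'react';\n" > "$tmp/leaf.js"

if has_js_deps "$tmp/blocked.js"; then echo "blocked.js: DEPENDENCY_BLOCK"; fi
if ! has_js_deps "$tmp/leaf.js"; then echo "leaf.js: LEAF CANDIDATE"; fi
rm -rf "$tmp"
```

In a real run, the two `if` branches map directly onto the decision above: proceed on a leaf candidate, exit with a dependency report otherwise.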

---

## Step 1: Identify Related Files (REQUIRED)

**Atomic Migration Scope:**
For core file `src/utils/example.js`, also migrate:
- `src/utils/example.test.js` / `src/utils/example.test.jsx`
- `src/utils/example.spec.js` / `src/utils/example.spec.jsx`
- `src/utils/__mocks__/example.js`
- Any other related test/mock files found by pattern matching

**Find all related test and mock files:**
```bash
# Pattern-based search for related files
basename=$(basename (unknown) .js)
dirname=$(dirname superset-frontend/(unknown))

# Find test files
find "$dirname" -name "${basename}.test.js" -o -name "${basename}.test.jsx"
find "$dirname" -name "${basename}.spec.js" -o -name "${basename}.spec.jsx"

# Find mock files (-path, not -name: -name cannot match across a '/')
find "$dirname" -path "*/__mocks__/${basename}.js"
find "$dirname" -name "${basename}.mock.js"
```

**Migration Requirement:** All discovered related files MUST be migrated together as one atomic unit.

**Test File Creation:** If NO test files exist for the core file, CREATE a minimal test file using the following pattern:
- Location: same directory as the core file
- Name: `{basename}.test.ts` (e.g., `DebouncedMessageQueue.test.ts`)
- Content: basic test structure importing and testing the main functionality
- Use proper TypeScript types in the test file

---

## 🗺️ Type Reference Map

### From `@superset-ui/core`
```typescript
// Data & Query
QueryFormData, QueryData, JsonObject, AnnotationData, AdhocMetric
LatestQueryFormData, GenericDataType, DatasourceType, ExtraFormData
DataMaskStateWithId, NativeFilterScope, NativeFiltersState, NativeFilterTarget

// UI & Theme
FeatureFlagMap, LanguagePack, ColorSchemeConfig, SequentialSchemeConfig
```

### From `@superset-ui/chart-controls`
```typescript
Dataset, ColumnMeta, ControlStateMapping
```

### From Local Types (`src/types/`)
```typescript
// Authentication
User, UserWithPermissionsAndRoles, BootstrapUser, PermissionsAndRoles

// Dashboard
Dashboard, DashboardState, DashboardInfo, DashboardLayout, LayoutItem
ComponentType, ChartConfiguration, ActiveFilters

// Charts
Chart, ChartState, ChartStatus, ChartLinkedDashboard, Slice, SaveActionType

// Data
Datasource, Database, Owner, Role

// UI Components
TagType, FavoriteStatus, Filter, ImportResourceName
```

### From Domain Types
```typescript
// src/dashboard/types.ts
RootState, ChartsState, DatasourcesState, FilterBarOrientation
ChartCrossFiltersConfig, ActiveTabs, MenuKeys

// src/explore/types.ts
ExplorePageInitialData, ExplorePageState, ExploreResponsePayload, OptionSortType

// src/SqlLab/types.ts
[SQL Lab specific types]
```

---

## 🏗️ Type Organization Strategy

### Type Placement Hierarchy

1. **Component-Colocated** (90% of cases)
   ```typescript
   // Same file as the component
   interface MyComponentProps {
     title: string;
     onClick: () => void;
   }
   ```

2. **Feature-Shared**
   ```typescript
   // src/[domain]/components/[Feature]/types.ts
   export interface FilterConfiguration {
     filterId: string;
     targets: NativeFilterTarget[];
   }
   ```

3. **Domain-Wide**
   ```typescript
   // src/[domain]/types.ts
   export interface ExploreFormData extends QueryFormData {
     viz_type: string;
   }
   ```

4. **Global**
   ```typescript
   // src/types/[TypeName].ts
   export interface ApiResponse<T> {
     result: T;
     count?: number;
   }
   ```

### Type Discovery Commands
```bash
# Search existing types before creating new ones
find superset-frontend/src -name "types.ts" -exec grep -l "[TypeConcept]" {} \;
grep -r "interface.*Props\|type.*Props" superset-frontend/src/
```

### Derivative Component Patterns

**Rule:** Components that extend others should extend their type interfaces.

```typescript
// ✅ Base component type
interface SelectProps {
  value: string | number;
  options: SelectOption[];
  onChange: (value: string | number) => void;
  disabled?: boolean;
}

// ✅ Derivative extends base
interface ChartSelectProps extends SelectProps {
  charts: Chart[];
  onChartSelect: (chart: Chart) => void;
}

// ✅ Derivative with modified props
interface DatabaseSelectProps extends Omit<SelectProps, 'value' | 'onChange'> {
  value: number; // Narrowed type
  onChange: (databaseId: number) => void; // Specific signature
}
```

**Common Patterns:**
- **Extension:** `extends BaseProps` - adds new props
- **Omission:** `Omit<BaseProps, 'prop'>` - removes props
- **Modification:** `Omit<BaseProps, 'prop'> & { prop: NewType }` - changes a prop's type
- **Restriction:** Override with narrower types (union → specific)
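
The **Restriction** pattern, the one without an example above, can be sketched like this. The `Status` union and the badge props are illustrative, not real Superset types:

```typescript
type Status = 'idle' | 'loading' | 'success' | 'error';

interface BaseBadgeProps {
  status: Status;
  label: string;
}

// Restriction: the derivative overrides `status` with a narrower union,
// so a spinner badge can never be constructed in a terminal state.
interface SpinnerBadgeProps extends Omit<BaseBadgeProps, 'status'> {
  status: Extract<Status, 'loading'>; // narrowed from the full union
}

const props: SpinnerBadgeProps = { status: 'loading', label: 'Fetching charts' };
console.log(props.status);
```

Passing `status: 'error'` to `SpinnerBadgeProps` is now a compile-time error, which is the point of the restriction.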

---

## 📋 Migration Recipe

### Step 2: File Conversion
```bash
# Use git mv to preserve history
git mv component.js component.ts
git mv Component.jsx Component.tsx
```

### Step 3: Import & Type Setup
```typescript
// Import order (enforced by linting)
import { FC, ReactNode } from 'react';
import { JsonObject, QueryFormData } from '@superset-ui/core';
import { Dataset } from '@superset-ui/chart-controls';
import type { Dashboard } from 'src/types/Dashboard';
```

### Step 4: Function & Component Typing
```typescript
// Functions with proper parameter/return types
export function processData(
  data: Dataset[],
  config: JsonObject
): ProcessedData[] {
  // implementation
}

// Component props with inheritance
interface ComponentProps extends BaseProps {
  data: Chart[];
  onSelect: (id: number) => void;
}

const Component: FC<ComponentProps> = ({ data, onSelect }) => {
  // implementation
};
```

### Step 5: State & Redux Typing
```typescript
// Hooks with specific types
const [data, setData] = useState<Chart[]>([]);
const [selected, setSelected] = useState<number | null>(null);

// Redux with the existing RootState
const mapStateToProps = (state: RootState) => ({
  charts: state.charts,
  user: state.user,
});
```

---
## 🧠 Type Debugging Strategies (Real-World Learnings)

### The Evolution of Type Approaches
When you hit type errors, follow this debugging evolution:

#### 1. ❌ Idealized Union Types (First Attempt)
```typescript
// Looks clean but doesn't match reality
type DatasourceInput = Datasource | QueryEditor;
```
**Problem**: Real calling sites pass variations, not exact types.

#### 2. ❌ Overly Precise Types (Second Attempt)
```typescript
// Tried to match exact calling signatures
type DatasourceInput =
  | IDatasource // From DatasourcePanel
  | (QueryEditor & { columns: ColumnMeta[] }); // From SaveQuery
```
**Problem**: Too rigid; doesn't handle legacy variations.

#### 3. ✅ Flexible Interface (Final Solution)
```typescript
// Captures what the function actually needs
interface DatasourceInput {
  name?: string | null; // Allow null for compatibility
  datasource_name?: string | null; // Legacy variations
  columns?: unknown[]; // Multiple column types accepted (unknown, not any)
  database?: { id?: number };
  // ... other optional properties
}
```
**Success**: Works with all calling sites and focuses on what the function needs.

### Type Debugging Process
1. **Start with compilation errors** - they show exact mismatches
2. **Examine actual usage** - look at calling sites, not idealized types
3. **Build flexible interfaces** - capture what functions need, not rigid contracts
4. **Iterate based on downstream validation** - let calling sites guide your types

---
## 🚨 Anti-Patterns to Avoid

```typescript
// ❌ Never use any
const obj: any = {};

// ✅ Use proper types
const obj: Record<string, JsonObject> = {};

// ❌ Don't recreate base component props
interface ChartSelectProps {
  value: string; // Duplicated from SelectProps
  onChange: () => void; // Duplicated from SelectProps
  charts: Chart[]; // New prop
}

// ✅ Inherit and extend
interface ChartSelectProps extends SelectProps {
  charts: Chart[]; // Only new props
}

// ❌ Don't create ad-hoc type variations
interface UserInfo {
  name: string;
  email: string;
}

// ✅ Extend existing types (DRY principle)
import { User } from 'src/types/bootstrapTypes';
type UserDisplayInfo = Pick<User, 'firstName' | 'lastName' | 'email'>;

// ❌ Don't create overly rigid unions
type StrictInput = ExactTypeA | ExactTypeB;

// ✅ Create flexible interfaces for function parameters
interface FlexibleInput {
  // Focus on what the function actually needs
  commonProperty: string;
  optionalVariations?: unknown; // Allow for legacy variations (unknown, not any)
}
```

## 📍 DRY Type Guidelines (WHERE TYPES BELONG)

### Type Placement Rules
**CRITICAL**: Type variations must live close to where they belong, not scattered across files.

#### ✅ Proper Type Organization
```typescript
// ❌ Don't create one-off interfaces in utility files
// src/utils/datasourceUtils.ts
interface DatasourceInput { /* custom interface */ } // Wrong!

// ✅ Use existing types or extend them in their proper domain
// src/utils/datasourceUtils.ts
import { IDatasource } from 'src/explore/components/DatasourcePanel';
import { QueryEditor } from 'src/SqlLab/types';

// Create a flexible interface that references existing types
interface FlexibleDatasourceInput {
  // Properties that actually exist across variations
}
```

#### Type Location Hierarchy
1. **Domain Types**: `src/{domain}/types.ts` (dashboard, explore, SqlLab)
2. **Component Types**: co-located with components
3. **Global Types**: `src/types/` directory
4. **Utility Types**: only when they truly don't belong elsewhere

#### ✅ DRY Type Patterns
```typescript
// ✅ Extend existing domain types
interface SaveQueryData extends Pick<QueryEditor, 'sql' | 'dbId' | 'catalog'> {
  columns: ColumnMeta[]; // Add what's needed
}

// ✅ Create flexible interfaces for cross-domain utilities
interface CrossDomainInput {
  // Common properties that exist across different source types
  name?: string | null; // Accommodate legacy null values
  // Only include properties the function actually uses
}
```

---
## 🎯 PropTypes Auto-Generation (Elegant Approach)

**IMPORTANT**: Superset has `babel-plugin-typescript-to-proptypes` configured to automatically generate PropTypes from TypeScript interfaces. Use this instead of manual PropTypes duplication!

### ❌ Manual PropTypes Duplication (Avoid This)
```typescript
export interface MyComponentProps {
  title: string;
  count?: number;
}

// 8+ lines of manual PropTypes duplication 😱
const propTypes = PropTypes.shape({
  title: PropTypes.string.isRequired,
  count: PropTypes.number,
});

export default propTypes;
```

### ✅ Auto-Generated PropTypes (Use This)
```typescript
import { InferProps } from 'prop-types';

export interface MyComponentProps {
  title: string;
  count?: number;
}

// Single validator function - the babel plugin auto-generates PropTypes! ✨
export default function MyComponentValidator(props: MyComponentProps) {
  return null; // PropTypes auto-assigned by babel-plugin-typescript-to-proptypes
}

// Optional: for consumers needing PropTypes type inference
export type MyComponentPropsInferred = InferProps<typeof MyComponentValidator>;
```

### Migration Pattern for Type-Only Files

**When migrating type-only files with manual PropTypes:**

1. **Keep the TypeScript interfaces** (single source of truth)
2. **Replace manual PropTypes** with a validator function
3. **Remove PropTypes imports** and manual shape definitions
4. **Add an InferProps import** if type inference is needed

**Example Migration:**
```typescript
// Before: 25+ lines with manual PropTypes duplication
export interface AdhocFilterType { /* ... */ }
const adhocFilterTypePropTypes = PropTypes.oneOfType([...]);

// After: 3 lines with auto-generation
export interface AdhocFilterType { /* ... */ }
export default function AdhocFilterValidator(props: { filter: AdhocFilterType }) {
  return null; // PropTypes auto-generated by the babel plugin
}
```

### Component PropTypes Pattern

**For React components, the babel plugin works automatically:**

```typescript
interface ComponentProps {
  title: string;
  onClick: () => void;
}

const MyComponent: FC<ComponentProps> = ({ title, onClick }) => {
  // Component implementation
};

// PropTypes automatically generated by the babel plugin - no manual work needed!
export default MyComponent;
```

### Auto-Generation Benefits

- ✅ **Single source of truth**: TypeScript interfaces drive PropTypes
- ✅ **No duplication**: eliminates 15-20 lines of manual PropTypes code
- ✅ **Automatic updates**: changes to TypeScript automatically update PropTypes
- ✅ **Type safety**: compile-time checking ensures PropTypes match interfaces
- ✅ **Backward compatibility**: existing JavaScript components continue working

### Babel Plugin Configuration

The plugin is already configured in `babel.config.js`:
```javascript
['babel-plugin-typescript-to-proptypes', { loose: true }]
```

**No additional setup required** - just use TypeScript interfaces and the plugin handles the rest!

---
## 🧪 Test File Migration Patterns

### Test File Priority
- **Always migrate test files** alongside production files
- **Test files are often leaf nodes** - good starting candidates
- **Create tests if missing** - leverage the new TypeScript types for better test coverage

### Test-Specific Type Patterns
```typescript
// Mock interfaces for testing
interface MockStore {
  getState: () => Partial<RootState>; // Partial allows minimal mocking
}

// Type-safe mocking for complex objects: build a partial object literal,
// then assert it to the full type
const mockDashboardInfo = {
  id: 123,
  json_metadata: '{}',
} as DashboardInfo;

// Sinon stub typing
let postStub: sinon.SinonStub;
beforeEach(() => {
  postStub = sinon.stub(SupersetClient, 'post');
});

// Use the stub reference instead of the original method
expect(postStub.callCount).toBe(1);
expect(postStub.getCall(0).args[0].endpoint).toMatch('/api/');
```

### Test Migration Recipe
1. **Migrate the production file first** (if both need migration)
2. **Update test imports** to point to the `.ts/.tsx` files
3. **Add proper mock typing** by asserting a partial object literal to the full type (`{ ... } as T`)
4. **Fix stub typing** - use stub references, not the original methods
5. **Verify all tests pass** with TypeScript compilation

---
## 🔧 Type Conflict Resolution

### Multiple Type Definitions Issue
**Problem**: The same type name defined in multiple files causes compilation errors.

**Example**: `DashboardInfo` defined in:
- `src/dashboard/reducers/types.ts` (minimal)
- `src/dashboard/components/Header/types.ts` (different shape)
- `src/dashboard/types.ts` (complete - used by RootState)

### Resolution Strategy
1. **Identify the authoritative type**:
   ```bash
   # Find which type is used by RootState/main interfaces
   grep -r "DashboardInfo" src/dashboard/types.ts
   ```

2. **Import from the authoritative source**:
   ```typescript
   // ✅ Import from the main domain types
   import { RootState, DashboardInfo } from 'src/dashboard/types';

   // ❌ Don't import from component-specific files
   import { DashboardInfo } from 'src/dashboard/components/Header/types';
   ```

3. **Mock complex types in tests**:
   ```typescript
   // For testing - provide only the minimal required fields,
   // then assert to the full type
   const mockInfo = {
     id: 123,
     json_metadata: '{}',
     // Only provide fields actually used in the test
   } as DashboardInfo;
   ```

### Type Hierarchy Discovery Commands
```bash
# Find all definitions of a type
grep -r "interface.*TypeName\|type.*TypeName" src/

# Find import usage patterns
grep -r "import.*TypeName" src/

# Check what RootState uses
grep -A 10 -B 10 "TypeName" src/*/types.ts
```

---
## Agent Constraints (CRITICAL)

1. **Use git mv** - Run `git mv file.js file.ts` to preserve git history, but NO `git commit`
2. **NO global import changes** - Don't update imports across the codebase
3. **Type files OK** - You may modify existing type files to improve/align types
4. **Single-File TypeScript Validation** (CRITICAL) - tsc has known issues with multi-file compilation:
   - **Core Issue**: TypeScript's `tsc` has documented problems validating multiple files simultaneously in complex projects
   - **Solution**: ALWAYS validate files one at a time using individual `tsc` calls
   - **Command Pattern**: `cd superset-frontend && npx tscw --noEmit --allowJs --composite false --project tsconfig.json {single-file-path}`
   - **Why**: Multi-file validation can produce false positives, miss real errors, and conflict during parallel agent execution
5. **Downstream Impact Validation** (CRITICAL) - Your migration affects calling sites:
   - **Find downstream files**: `find superset-frontend/src -name "*.tsx" -o -name "*.ts" | xargs grep -l "your-core-filename" 2>/dev/null || echo "No files found"`
   - **Validate each downstream file individually**: `cd superset-frontend && npx tscw --noEmit --allowJs --composite false --project tsconfig.json {each-downstream-file}`
   - **Fix type mismatches** you introduced in calling sites
   - **NEVER ignore downstream errors** - they indicate your types don't match reality
6. **Avoid Project-Wide Validation During Migration**:
   - **NEVER use `npm run type`** during parallel agent execution - it produces unreliable results
   - **Single-file validation is authoritative** - trust individual file checks over project-wide scans
7. **ESLint validation** - Run `npm run eslint -- --fix {file}` for each migrated file to auto-fix formatting/linting issues
8. Zero `any` types - use proper TypeScript types
9. Search existing types before creating new ones
10. Follow the patterns in this guide
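
Constraints 4 and 5 amount to a per-file loop. Here is a dry-run sketch: the file list is hypothetical, and `echo` stands in for the `tscw` command given above so the loop itself can be exercised anywhere:

```bash
# Sketch of the single-file validation loop from constraints 4-5.
# VALIDATE uses `echo` for a dry run; inside superset-frontend, drop the
# leading `echo` to run the real per-file check.
VALIDATE="echo npx tscw --noEmit --allowJs --composite false --project tsconfig.json"

# Hypothetical migrated + downstream files:
files="src/utils/example.ts src/utils/example.test.ts src/pages/Caller.tsx"

fail=0
for f in $files; do
  if $VALIDATE "$f" > /dev/null; then
    echo "PASS: $f"
  else
    echo "FAIL: $f"   # never ignore these - fix the type mismatch you introduced
    fail=1
  fi
done
echo "failures: $fail"
```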

---

## Success Report Format

```
SUCCESS: Atomic Migration of {core-filename}

## Files Migrated (Atomic Unit)
- Core: {core-filename} → {core-filename.ts/tsx}
- Tests: {list-of-test-files} → {list-of-test-files.ts/tsx} OR "CREATED: {basename}.test.ts"
- Mocks: {list-of-mock-files} → {list-of-mock-files.ts}
- Type files modified: {list-of-type-files}

## Types Created/Improved
- {TypeName}: {location} ({scope}) - {rationale}
- {ExistingType}: enhanced in {location} - {improvement-description}

## Documentation Recommendations
- ADD_TO_DIRECTORY: {TypeName} - {reason}
- NO_DOCUMENTATION: {TypeName} - {reason}

## Quality Validation
- **Single-File TypeScript Validation**: ✅ PASS - Core files individually validated
  - Core file: `npx tscw --noEmit --allowJs --composite false --project tsconfig.json {core-file}`
  - Test files: `npx tscw --noEmit --allowJs --composite false --project tsconfig.json {test-file}` (if exists)
- **Downstream Impact Check**: ✅ PASS - Found {N} files importing this module, all validate individually
  - Downstream files: {list-of-files-that-import-your-module}
  - Individual validation: `npx tscw --noEmit --allowJs --composite false --project tsconfig.json {each-downstream-file}`
- **ESLint validation**: ✅ PASS (using `npm run eslint -- --fix {files}` to auto-fix formatting)
- **Zero any types**: ✅ PASS
- **Local imports resolved**: ✅ PASS
- **Functionality preserved**: ✅ PASS
- **Tests pass** (if test file): ✅ PASS
- **Follow-up action required**: {YES/NO}

## Validation Strategy Notes
- **Single-file approach used**: avoided multi-file tsc validation due to known TypeScript compilation issues
- **Project-wide validation skipped**: `npm run type` not used during parallel migration to prevent false positives

## Migration Learnings
- Type conflicts encountered: {describe any multiple type definitions}
- Mock patterns used: {describe test mocking approaches}
- Import hierarchy decisions: {note authoritative type sources used}
- PropTypes strategy: {AUTO_GENERATED via babel plugin | MANUAL_DUPLICATION_REMOVED | N/A}

## Improvement Suggestions for Documentation
- AGENT.md enhancement: {suggest additions to migration guide}
- Common pattern identified: {note reusable patterns for future migrations}
```

---

## Dependency Block Report Format

```
DEPENDENCY_BLOCK: Cannot migrate (unknown)

## Blocking Dependencies
- {path}: {type} - {usage} - {priority}

## Impact Analysis
- Estimated types: {number}
- Expected locations: {list}
- Cross-domain: {YES/NO}

## Recommended Order
{ordered-list}
```

---
## 📚 Quick Reference

**Type Utilities:**
- `Record<K, V>` - Object with specific key/value types
- `Partial<T>` - All properties optional
- `Pick<T, K>` - Subset of properties
- `Omit<T, K>` - Exclude specific properties
- `NonNullable<T>` - Exclude null/undefined
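
A quick illustration of these utilities on a hypothetical `ChartMeta` shape (not a real Superset type):

```typescript
// Hypothetical shape for illustration only.
interface ChartMeta {
  id: number;
  name: string;
  owner: string | null;
  vizType?: string;
}

// Record<K, V>: a lookup table with typed keys and values
const byId: Record<number, ChartMeta> = {
  1: { id: 1, name: 'Sales', owner: null },
};

// Partial<T>: patch objects where every property is optional
const patch: Partial<ChartMeta> = { name: 'Renamed chart' };

// Pick<T, K> / Omit<T, K>: slice a type down for a narrower consumer
type ChartLabel = Pick<ChartMeta, 'id' | 'name'>;
type ChartWithoutOwner = Omit<ChartMeta, 'owner'>;

// NonNullable<T>: strip null/undefined from a property's type
type OwnerName = NonNullable<ChartMeta['owner']>; // string

const label: ChartLabel = { id: byId[1].id, name: byId[1].name };
console.log(label.name, patch.name);
```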

**Event Types:**
- `MouseEvent<HTMLButtonElement>`
- `ChangeEvent<HTMLInputElement>`
- `FormEvent<HTMLFormElement>`

**React Types:**
- `FC<Props>` - Functional component
- `ReactNode` - Any renderable content
- `CSSProperties` - Style objects

---

**Remember:** Every type should add value and clarity. The goal is meaningful type safety that catches bugs and improves developer experience.
@@ -1,199 +0,0 @@
# JS-to-TS Coordinator Workflow

**Role:** Strategic migration coordination - select leaf-node files, trigger agents, review results, handle integration, manage dependencies.

---

## 1. Core File Selection Strategy

**Target ONLY Core Files**: Coordinators identify core files (production code); agents handle related tests/mocks atomically.

**File Analysis Commands**:
```bash
# Find CORE files with no JS/JSX dependencies (exclude tests/mocks) - SIZE PRIORITIZED
find superset-frontend/src -name "*.js" -o -name "*.jsx" | grep -v "test\|spec\|mock" | xargs wc -l | sort -n | head -20

# Alternative: get file sizes in lines with paths
find superset-frontend/src -name "*.js" -o -name "*.jsx" | grep -v "test\|spec\|mock" | while read file; do
  lines=$(wc -l < "$file")
  echo "$lines $file"
done | sort -n | head -20

# Check dependencies for core files only (start with the smallest)
for file in <core-files-sorted-by-size>; do
  echo "=== $file ($(wc -l < "$file") lines) ==="
  grep -E "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" "$file" || echo "✅ LEAF CANDIDATE"
done

# Identify heavily imported files (migrate last)
grep -r "from.*utils/common" superset-frontend/src/ | wc -l

# Quick leaf analysis with size priority
find superset-frontend/src -name "*.js" -o -name "*.jsx" | grep -v "test\|spec\|mock" | head -30 | while read file; do
  deps=$(grep -E "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" "$file" | wc -l)
  lines=$(wc -l < "$file")
  if [ "$deps" -eq 0 ]; then
    echo "✅ LEAF: $lines lines - $file"
  fi
done | sort -n
```

**Priority Order** (smallest files first for easier wins):
1. **Small leaf files** (<50 lines) - no JS/JSX imports, quick TypeScript conversion
2. **Medium leaf files** (50-200 lines) - self-contained utilities and helpers
3. **Small dependency files** (<100 lines) - import only already-migrated files
4. **Larger components** (200+ lines) - complex but well-contained functionality
5. **Core foundational files** (`utils/common.js`, `controls.jsx`) - migrate last regardless of size

**Size-First Benefits**:
- Faster completions build momentum
- Earlier validation of migration patterns
- Easier rollback if issues arise
- Better success rate for agent learning

**Migration Unit**: Each agent call migrates:
- 1 core file (primary target)
- All related `*.test.js/jsx` files
- All related `*.mock.js` files
- All related `__mocks__/` files
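
Gathering one agent call's unit can be sketched as a small helper; the function name is illustrative, and the demo runs on throwaway files so the sketch is self-contained:

```bash
# Hypothetical helper listing one agent call's atomic migration unit:
# the core file plus each related test/mock file that exists on disk.
list_migration_unit() {
  core="$1"
  dir=$(dirname "$core")
  base=$(basename "$core" .js)
  echo "$core"
  for f in "$dir/$base.test.js" "$dir/$base.test.jsx" \
           "$dir/$base.mock.js" "$dir/__mocks__/$base.js"; do
    [ -f "$f" ] && echo "$f"
  done
}

# Self-contained demo on throwaway files:
tmp=$(mktemp -d)
touch "$tmp/common.js" "$tmp/common.test.js"
mkdir "$tmp/__mocks__" && touch "$tmp/__mocks__/common.js"
list_migration_unit "$tmp/common.js"   # prints the core file plus each existing related file
rm -rf "$tmp"
```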
|
||||
|
||||
---
|
||||
|
||||
## 2. Task Creation & Agent Control

### Task Triggering

When triggering the `/js-to-ts` command:

- **Task Title**: Use the core filename as the task title (e.g., "DebouncedMessageQueue.js migration", "hostNamesConfig.js migration")
- **Task Description**: Include the full relative path so the agent can locate the file
- **Reference**: Point the agent to [AGENT.md](./AGENT.md) for technical instructions

### Post-Processing Workflow

After each agent completes:

1. **Review Agent Report**: Always read and analyze the complete agent report
2. **Share Summary**: Provide the user with key highlights from the agent's work:
   - Files migrated (core + tests/mocks)
   - Types created or improved
   - Any validation issues or coordinator actions needed
3. **Quality Assessment**: Evaluate the agent's TypeScript implementation against these criteria:
   - ✅ **Type Usage**: Proper types used, no `any` types
   - ✅ **Type Filing**: Types placed in the correct hierarchy (component → feature → domain → global)
   - ✅ **Side Effects**: No unintended changes to other files
   - ✅ **Import Alignment**: Proper `.ts`/`.tsx` import extensions
4. **Integration Decision**:
   - **COMMIT**: Agent work is complete and high quality
   - **FIX & COMMIT**: Minor issues need coordinator fixes
   - **ROLLBACK**: Major issues require complete rework
5. **Next Action**: Ask the user's preference - commit this work or trigger the next migration

---

## 3. Integration Decision Framework

**Automatic Integration** ✅:

- `npm run type` passes without errors
- Agent produced clean TypeScript with proper types
- Types appropriately filed in the hierarchy

**Coordinator Integration (Fix Side Effects)** 🔧:

- `npm run type` fails, but the agent's work is high quality
- Good type usage, proper patterns, well organized
- Side effects are manageable TypeScript compilation errors
- **Coordinator Action**: Integrate the change, then fix global compilation issues

**Rollback Only** ❌:

- Agent introduced `any` types or poor type choices
- Types poorly organized or conflicting with existing patterns
- Fundamental approach issues requiring complete rework

**Integration Process**:

1. **Review**: Agent already used `git mv` to preserve history
2. **Fix Side Effects**: Update dependent files with proper import extensions
3. **Resolve Types**: Fix any cascading type issues across the codebase
4. **Validate**: Ensure `npm run type` passes after the fixes

---

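The three outcomes above can be encoded as a tiny decision function; this is purely illustrative, with both inputs (did `npm run type` pass, is the agent's work high quality) judged by the coordinator:

```shell
decide() {
  type_ok="$1"; quality_ok="$2"
  if [ "$quality_ok" != "yes" ]; then
    echo "ROLLBACK"       # poor types or wrong approach: rework required
  elif [ "$type_ok" = "yes" ]; then
    echo "COMMIT"         # clean compile + good work: integrate as-is
  else
    echo "FIX & COMMIT"   # good work, fixable compilation side effects
  fi
}

decide yes yes   # COMMIT
decide no  yes   # FIX & COMMIT
decide no  no    # ROLLBACK
```

The key asymmetry: quality problems always win (rollback), because the coordinator can fix compilation fallout but not a fundamentally poor type design.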
## 4. Common Integration Patterns

**Common Side Effects (Expect These)**:

- **Type import conflicts**: Multiple definitions of the same type name
- **Mock object typing**: Tests need complete type satisfaction
- **Stub method references**: Use stub variables instead of the original methods

**Coordinator Fixes (Standard Process)**:

1. **Import Resolution**:

   ```bash
   # Find the authoritative type source
   grep -r "TypeName" src/*/types.ts
   # Import from domain types (src/dashboard/types.ts), not component types
   ```

2. **Test Mock Completion**:

   ```typescript
   // Use the `Partial<T> as T` pattern for minimal mocking
   const mockDashboard = {
     id: 123,
     json_metadata: '{}',
   } as Partial<DashboardInfo> as DashboardInfo;
   ```

3. **Stub Reference Fixes**:

   ```typescript
   // ✅ Use the stub variable
   expect(postStub.callCount).toBe(1);
   // ❌ Don't use the original method
   expect(SupersetClient.post.callCount).toBe(1);
   ```

4. **Validation Commands**:

   ```bash
   npm run type         # TypeScript compilation
   npm test -- filename # Test functionality
   git status           # Should show a rename, not add/delete
   ```

---

## 5. File Categories for Planning

### Leaf Files (Start Here)

**Self-contained files with minimal JS/JSX dependencies**:

- Test files (80 files) - Usually import only the file being tested
- Utility files without internal dependencies
- Components importing only external libraries

### Heavily Imported Files (Migrate Last)

**Core files that many others depend on**:

- `utils/common.js` - Core utility functions
- `utils/reducerUtils.js` - Redux helpers
- `@superset-ui/core` equivalent files
- Major state management files (`explore/store.js`, `dashboard/actions/`)

### Complex Components (Middle Priority)

**Large files requiring careful type analysis**:

- `components/Datasource/DatasourceEditor.jsx` (1,809 lines)
- `explore/components/controls/AnnotationLayerControl/AnnotationLayer.jsx` (1,031 lines)
- `explore/components/ExploreViewContainer/index.jsx` (911 lines)

---

## 6. Success Metrics & Continuous Improvement

**Per-File Gates**:

- ✅ `npm run type` passes after each migration
- ✅ Zero `any` types introduced
- ✅ All imports properly typed
- ✅ Types filed in the correct hierarchy

**Linear Scheduling**:

When agents report `DEPENDENCY_BLOCK`:

- Queue dependencies in linear order
- Process one file at a time to avoid conflicts
- Handle cascading type changes between files

**After Each Migration**:

1. **Update guides** with new patterns discovered
2. **Document coordinator fixes** that become common
3. **Enhance agent instructions** based on recurring issues
4. **Track success metrics** - automatic vs. coordinator integration rates

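The linear-scheduling rule above can be sketched as a small queue loop, with hypothetical file names: files are processed one at a time, and a `DEPENDENCY_BLOCK` report pushes the blocking dependency in front of the blocked file, which is then retried.

```shell
blocked_once=""
migrate() {
  # Stub agent: pretend b.js is blocked on c.js until c.js has been migrated
  if [ "$1" = "b.js" ] && [ "$blocked_once" != "done" ]; then
    echo "DEPENDENCY_BLOCK c.js"
    return 1
  fi
  echo "migrated"
}

order=""
queue=(a.js b.js)
while [ "${#queue[@]}" -gt 0 ]; do
  f="${queue[0]}"; queue=("${queue[@]:1}")
  out=$(migrate "$f")
  if [ "$out" = "migrated" ]; then
    order="$order$f "
    [ "$f" = "c.js" ] && blocked_once="done"
  else
    dep="${out#DEPENDENCY_BLOCK }"
    queue=("$dep" "$f" "${queue[@]}")   # dependency first, then the retry
  fi
done
echo "$order"   # a.js c.js b.js
```

Because only one file is in flight at a time, cascading type changes are always resolved before the blocked file is retried, which is the point of the linear order.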
# JavaScript to TypeScript Migration Project

Progressive migration of 219 JS/JSX files to TypeScript in the Apache Superset frontend.

## 📁 Project Documentation

- **[AGENT.md](./AGENT.md)** - Complete technical migration guide for agents (type reference, patterns, validation)
- **[COORDINATOR.md](./COORDINATOR.md)** - Strategic workflow for coordinators (file selection, task management, integration)

## 🎯 Quick Start

**For Agents:** Read [AGENT.md](./AGENT.md) for complete migration instructions.
**For Coordinators:** Read [COORDINATOR.md](./COORDINATOR.md) for the workflow and [AGENT.md](./AGENT.md) for supervision.

**Command:** `/js-to-ts <filename>` - See [../../commands/js-to-ts.md](../../commands/js-to-ts.md)

## 📊 Migration Progress

**Scope**: 219 files total (112 JS + 107 JSX)

- Production files: 139 (63%)
- Test files: 80 (37%)

**Strategy**: Leaf-first migration with dependency-aware coordination

### Completed Migrations ✅

1. **roundDecimal** - `plugins/legacy-plugin-chart-map-box/src/utils/roundDecimal.js`
   - Migrated core + test files
   - Added a proper TypeScript function signature with an optional precision parameter
   - All tests pass

2. **timeGrainSqlaAnimationOverrides** - `src/explore/controlPanels/timeGrainSqlaAnimationOverrides.js`
   - Migrated to TypeScript with `ControlPanelState` and `Dataset` types
   - Added a `TimeGrainOverrideState` interface for the return type
   - Used type guards for safe property access

3. **DebouncedMessageQueue** - `src/utils/DebouncedMessageQueue.js`
   - Migrated to TypeScript with proper generics
   - Created a `DebouncedMessageQueueOptions` interface
   - **Created a test file** with 4 comprehensive test cases
   - Excellent class property typing with `private`/`readonly` modifiers

**Files Migrated**: 3/219 (1.4%)
**Tests Created**: 1 (roundDecimal had an existing test; DebouncedMessageQueue's was created)

### Next Candidates (Leaf Nodes) 🎯

**Identified leaf files with no JS/JSX dependencies:**

- `src/utils/hostNamesConfig.js` - Domain configuration utility
- `src/explore/controlPanels/Separator.js` - Control panel configuration
- `src/middleware/loggerMiddleware.js` - Logging middleware

**Migration Quality**: All completed migrations have:

- ✅ Zero `any` types
- ✅ Proper TypeScript compilation
- ✅ ESLint validation passed
- ✅ Test coverage (created where missing)

---

## 📈 Success Metrics

**Per-File Gates**:

- ✅ `npm run type` passes after each migration
- ✅ Zero `any` types introduced
- ✅ All imports properly typed
- ✅ Types filed in the correct hierarchy

**Overall Progress**:

- **Automatic Integration Rate**: 100% (3/3 migrations required no coordinator fixes)
- **Test Coverage**: Improved (1 new test file created)
- **Type Safety**: Enhanced with proper interfaces and generics

---

*This is a claudette-managed progressive refactor. All documentation and coordination resources are organized under `.claude/projects/js-to-ts/`*

.codecov.yml

codecov:
  notify:
    after_n_builds: 4
ignore:
  - "superset/migrations/versions/*.py"
  - "superset-frontend/packages/superset-ui-demo/**/*"
  - "**/*.stories.tsx"
  - "**/*.stories.jsx"
coverage:
  status:
    project:
      default:
        informational: true
        # Commits pushed to master should not make the overall
        # project coverage decrease:
        target: auto
        threshold: 0%
      core-packages-ts:
        target: 100%
        paths:
          - 'superset-frontend/packages'
          - '!superset-frontend/packages/**/*.jsx'
          - '!superset-frontend/packages/**/*.tsx'
      core-packages-tsx:
        target: 50%
        paths:
          - 'superset-frontend/packages/**/*.jsx'
          - 'superset-frontend/packages/**/*.tsx'
    patch:
      default:
        informational: true
        threshold: 0%
flag_management:
  default_rules:
    carryforward: true

.coveragerc

# .coveragerc to control coverage.py
[run]
branch = True
source = superset
# omit = bad_file.py

[paths]
source =
    superset/
    */site-packages/

[report]
# Regexes for lines to exclude from consideration
exclude_lines =
    # Have to re-enable the standard pragma
    pragma: no cover

    # Don't complain about missing debug-only code:
    def __repr__
    if self\.debug

    # Don't complain if tests don't hit defensive assertion code:
    raise AssertionError
    raise NotImplementedError

    # Don't complain if non-runnable code isn't run:
    if 0:
    if __name__ == .__main__.:

    # Ignore importlib backport
    from importlib

    if TYPE_CHECKING:

#fail_under = 100
show_missing = True

---
description: Apache Superset development standards and guidelines for Cursor IDE
globs: ["**/*.py", "**/*.ts", "**/*.tsx", "**/*.js", "**/*.jsx", "**/*.sql", "**/*.md"]
alwaysApply: true
---

# Apache Superset Development Standards for Cursor IDE

Apache Superset is a data visualization platform with a Flask/Python backend and a React/TypeScript frontend.

## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)

**These migrations are actively happening - avoid deprecated patterns:**

### Frontend Modernization
- **NO `any` types** - Use proper TypeScript types
- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
- **NO Enzyme** - Use React Testing Library/Jest (Enzyme fully removed)
- **Use @superset-ui/core** - Don't import Ant Design directly

### Testing Strategy Migration
- **Prefer unit tests** over integration tests
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is a last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing

### Backend Type Safety
- **Add type hints** - All new Python code needs proper typing
- **MyPy compliance** - Run `pre-commit run mypy` to validate
- **SQLAlchemy typing** - Use proper model annotations

## Code Standards

### TypeScript Frontend
- **NO `any` types** - Use proper TypeScript
- **Functional components** with hooks
- **@superset-ui/core** for UI components (not direct antd)
- **Jest** for testing (NO Enzyme)
- **Redux** for global state, hooks for local state

### Python Backend
- **Type hints required** for all new code
- **MyPy compliant** - run `pre-commit run mypy`
- **SQLAlchemy models** with proper typing
- **pytest** for testing

### Apache License Headers
- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead

## Key Directory Structure

```
superset/
├── superset/                 # Python backend (Flask, SQLAlchemy)
│   ├── views/api/            # REST API endpoints
│   ├── models/               # Database models
│   └── connectors/           # Database connections
├── superset-frontend/src/    # React TypeScript frontend
│   ├── components/           # Reusable components
│   ├── explore/              # Chart builder
│   ├── dashboard/            # Dashboard interface
│   └── SqlLab/               # SQL editor
├── superset-frontend/packages/
│   └── superset-ui-core/     # UI component library (USE THIS)
├── tests/                    # Python/integration tests
├── docs/                     # Documentation (UPDATE FOR CHANGES)
└── UPDATING.md               # Breaking changes log
```

## Architecture Patterns

### Dataset-Centric Approach
Charts are built from enriched datasets containing:
- Dimension columns with labels/descriptions
- Predefined metrics as SQL expressions
- Self-service analytics within defined contexts

### Security & Features
- **RBAC**: Role-based access via Flask-AppBuilder
- **Feature flags**: Control feature rollouts
- **Row-level security**: SQL-based data access control

## Test Utilities

### Python Test Helpers
- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
- **`@with_config`** - Config mocking decorator
- **`@with_feature_flags`** - Feature flag testing
- **`login_as()`, `login_as_admin()`** - Authentication helpers
- **`create_dashboard()`, `create_slice()`** - Data setup utilities

### TypeScript Test Helpers
- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
- **`createWrapper()`** - Redux/Router/Theme wrapper
- **`selectOption()`** - Select component helper
- **React Testing Library** - NO Enzyme (removed)

## Pre-commit Validation

**Use pre-commit hooks for quality validation:**

```bash
# Install hooks
pre-commit install

# Quick validation (faster than --all-files)
pre-commit run           # Staged files only
pre-commit run mypy      # Python type checking
pre-commit run prettier  # Code formatting
pre-commit run eslint    # Frontend linting
```

## Development Guidelines

- **Documentation**: Update docs/ for any user-facing changes
- **Breaking Changes**: Add to UPDATING.md
- **Docstrings**: Required for new functions/classes
- **Follow existing patterns**: Mimic code style, use existing libraries and utilities
- **Type Safety**: This codebase is actively modernizing toward full TypeScript and type safety
- **Always run `pre-commit run`** to validate changes before committing

---

**Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing refactors section to avoid deprecated patterns.

# Keep this in sync with the base image in the main Dockerfile (ARG PY_VER)
FROM python:3.11.13-trixie AS base

# Install system dependencies that Superset needs
# This layer will be cached across Codespace sessions
RUN apt-get update && apt-get install -y \
    libsasl2-dev \
    libldap2-dev \
    libpq-dev \
    tmux \
    gh \
    && rm -rf /var/lib/apt/lists/*

# Install uv for fast Python package management
# This will also be cached in the image
RUN curl -LsSf https://astral.sh/uv/install.sh | sh && \
    echo 'export PATH="/root/.cargo/bin:$PATH"' >> /etc/bash.bashrc

# Set the cargo/bin directory in PATH for all users
ENV PATH="/root/.cargo/bin:${PATH}"

# Superset Development with GitHub Codespaces

For complete documentation on using GitHub Codespaces with Apache Superset, please see:

**[Setting up a Development Environment - GitHub Codespaces](https://superset.apache.org/docs/contributing/development#github-codespaces-cloud-development)**

## Pre-installed Development Environment

When you create a new Codespace from this repository, it automatically:

1. **Creates a Python virtual environment** using `uv venv`
2. **Installs all development dependencies** via `uv pip install -r requirements/development.txt`
3. **Sets up pre-commit hooks** with `pre-commit install`
4. **Activates the virtual environment** automatically in all terminals

The virtual environment is located at `/workspaces/{repository-name}/.venv` and is automatically activated through environment variables set in the devcontainer configuration.

# Superset Codespaces environment setup
# This file is appended to ~/.bashrc during Codespace setup

# Find the workspace directory (handles both 'superset' and 'superset-2' names)
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)

if [ -n "$WORKSPACE_DIR" ]; then
    # Check if virtual environment exists
    if [ -d "$WORKSPACE_DIR/.venv" ]; then
        # Activate the virtual environment
        source "$WORKSPACE_DIR/.venv/bin/activate"
        echo "✅ Python virtual environment activated"

        # Verify pre-commit is installed and set up
        if command -v pre-commit &> /dev/null; then
            echo "✅ pre-commit is available ($(pre-commit --version))"
            # Install git hooks if not already installed
            if [ -d "$WORKSPACE_DIR/.git" ] && [ ! -f "$WORKSPACE_DIR/.git/hooks/pre-commit" ]; then
                echo "🪝 Installing pre-commit hooks..."
                cd "$WORKSPACE_DIR" && pre-commit install
            fi
        else
            echo "⚠️ pre-commit not found. Run: pip install pre-commit"
        fi
    else
        echo "⚠️ Python virtual environment not found at $WORKSPACE_DIR/.venv"
        echo "   Run: cd $WORKSPACE_DIR && .devcontainer/setup-dev.sh"
    fi

    # Always cd to the workspace directory for convenience
    cd "$WORKSPACE_DIR"
fi

# Add helpful aliases for Superset development
alias start-superset="$WORKSPACE_DIR/.devcontainer/start-superset.sh"
alias setup-dev="$WORKSPACE_DIR/.devcontainer/setup-dev.sh"

# Show helpful message on login
echo ""
echo "🚀 Superset Codespaces Environment"
echo "=================================="

# Check if Superset is running
if docker ps 2>/dev/null | grep -q "superset"; then
    echo "✅ Superset is running!"
    echo "   - Check the 'Ports' tab for your live Superset URL"
    echo "   - Initial startup takes 10-20 minutes"
    echo "   - Login: admin/admin"
else
    echo "⚠️ Superset is not running. Use: start-superset"
    # Check if there's a startup log
    if [ -f "/tmp/superset-startup.log" ]; then
        echo "   📋 Startup log found: cat /tmp/superset-startup.log"
    fi
fi

echo ""
echo "Quick commands:"
echo "  start-superset - Start Superset with Docker Compose"
echo "  setup-dev      - Set up Python environment (if not already done)"
echo "  pre-commit run - Run pre-commit checks on staged files"
echo ""

#!/bin/bash
# Script to build and push the devcontainer image to GitHub Container Registry
# This allows caching the image between Codespace sessions

# You'll need to run this with appropriate GitHub permissions:
# gh auth login --scopes write:packages

REGISTRY="ghcr.io"
OWNER="apache"
REPO="superset"
TAG="devcontainer-base"

echo "Building devcontainer image..."
docker build -t $REGISTRY/$OWNER/$REPO:$TAG .devcontainer/

echo "Pushing to GitHub Container Registry..."
docker push $REGISTRY/$OWNER/$REPO:$TAG

echo "Done! Update .devcontainer/devcontainer.json to use:"
echo "  \"image\": \"$REGISTRY/$OWNER/$REPO:$TAG\""

{
  "name": "Apache Superset Development",
  // Option 1: Use pre-built image directly
  // "image": "ghcr.io/apache/superset:devcontainer-base",

  // Option 2: Build from Dockerfile with cache (current approach)
  "build": {
    "dockerfile": "Dockerfile",
    "context": ".",
    // Cache from the Apache registry image
    "cacheFrom": ["ghcr.io/apache/superset:devcontainer-base"]
  },

  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {
      "moby": true,
      "dockerDashComposeVersion": "v2"
    },
    "ghcr.io/devcontainers/features/node:1": {
      "version": "20"
    },
    "ghcr.io/devcontainers/features/git:1": {},
    "ghcr.io/devcontainers/features/common-utils:2": {
      "configureZshAsDefaultShell": true
    },
    "ghcr.io/devcontainers/features/sshd:1": {
      "version": "latest"
    }
  },

  // Forward ports for development
  "forwardPorts": [9001],
  "portsAttributes": {
    "9001": {
      "label": "Superset (via Webpack Dev Server)",
      "onAutoForward": "notify",
      "visibility": "public"
    }
  },

  // Run commands after container is created
  "postCreateCommand": "bash .devcontainer/setup-dev.sh || echo '⚠️ Setup had issues - run .devcontainer/setup-dev.sh manually'",

  // Auto-start Superset after ensuring Docker is ready
  // Run in foreground to see any errors, but don't block on failures
  "postStartCommand": "bash -c 'echo \"Waiting 30s for services to initialize...\"; sleep 30; .devcontainer/start-superset.sh || echo \"⚠️ Auto-start failed - run start-superset manually\"'",

  // Set environment variables
  "remoteEnv": {
    // Removed automatic venv activation to prevent startup issues
    // The setup script will handle this
  },

  // VS Code customizations
  "customizations": {
    "vscode": {
      "extensions": [
        "ms-python.python",
        "ms-python.vscode-pylance",
        "charliermarsh.ruff",
        "dbaeumer.vscode-eslint",
        "esbenp.prettier-vscode"
      ]
    }
  }
}

#!/bin/bash
# Setup script for Superset Codespaces development environment

echo "🔧 Setting up Superset development environment..."

# System dependencies and uv are now pre-installed in the Docker image
# This speeds up Codespace creation significantly!

# Create virtual environment using uv
echo "🐍 Creating Python virtual environment..."
if ! uv venv; then
    echo "❌ Failed to create virtual environment"
    exit 1
fi

# Install Python dependencies
echo "📦 Installing Python dependencies..."
if ! uv pip install -r requirements/development.txt; then
    echo "❌ Failed to install Python dependencies"
    echo "💡 You may need to run this manually after the Codespace starts"
    exit 1
fi

# Install pre-commit hooks
echo "🪝 Installing pre-commit hooks..."
if source .venv/bin/activate && pre-commit install; then
    echo "✅ Pre-commit hooks installed"
else
    echo "⚠️ Pre-commit hooks installation failed (non-critical)"
fi

# Install Claude Code CLI via npm
echo "🤖 Installing Claude Code..."
if npm install -g @anthropic-ai/claude-code; then
    echo "✅ Claude Code installed"
else
    echo "⚠️ Claude Code installation failed (non-critical)"
fi

# Make the start script executable
chmod +x .devcontainer/start-superset.sh

# Add bashrc additions for automatic venv activation
echo "🔧 Setting up automatic environment activation..."
if [ -f ~/.bashrc ]; then
    # Check if we've already added our additions
    if ! grep -q "Superset Codespaces environment setup" ~/.bashrc; then
        echo "" >> ~/.bashrc
        cat .devcontainer/bashrc-additions >> ~/.bashrc
        echo "✅ Added automatic venv activation to ~/.bashrc"
    else
        echo "✅ Bashrc additions already present"
    fi
else
    # Create bashrc if it doesn't exist
    cat .devcontainer/bashrc-additions > ~/.bashrc
    echo "✅ Created ~/.bashrc with automatic venv activation"
fi

# Also add to zshrc since that's the default shell
if [ -f ~/.zshrc ] || [ -n "$ZSH_VERSION" ]; then
    if ! grep -q "Superset Codespaces environment setup" ~/.zshrc; then
        echo "" >> ~/.zshrc
        cat .devcontainer/bashrc-additions >> ~/.zshrc
        echo "✅ Added automatic venv activation to ~/.zshrc"
    fi
fi

echo "✅ Development environment setup complete!"
echo ""
echo "📝 The virtual environment will be automatically activated in new terminals"
echo ""
echo "🔄 To activate in this terminal, run:"
echo "   source ~/.bashrc"
echo ""
echo "🚀 To start Superset:"
echo "   start-superset"
echo ""

#!/bin/bash
# Startup script for Superset in Codespaces

# Log to a file for debugging
LOG_FILE="/tmp/superset-startup.log"
echo "[$(date)] Starting Superset startup script" >> "$LOG_FILE"
echo "[$(date)] User: $(whoami), PWD: $(pwd)" >> "$LOG_FILE"

echo "🚀 Starting Superset in Codespaces..."
echo "🌐 Frontend will be available at port 9001"

# Find the workspace directory (Codespaces clones as 'superset', not 'superset-2')
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
if [ -n "$WORKSPACE_DIR" ]; then
    cd "$WORKSPACE_DIR"
    echo "📁 Working in: $WORKSPACE_DIR"
else
    echo "📁 Using current directory: $(pwd)"
fi

# Wait for Docker to be available
echo "⏳ Waiting for Docker to start..."
echo "[$(date)] Waiting for Docker..." >> "$LOG_FILE"
max_attempts=30
attempt=0
while ! docker info > /dev/null 2>&1; do
    if [ $attempt -eq $max_attempts ]; then
        echo "❌ Docker failed to start after $max_attempts attempts"
        echo "[$(date)] Docker failed to start after $max_attempts attempts" >> "$LOG_FILE"
        echo "🔄 Please restart the Codespace or run this script manually later"
        exit 1
    fi
    echo "   Attempt $((attempt + 1))/$max_attempts..."
    echo "[$(date)] Docker check attempt $((attempt + 1))/$max_attempts" >> "$LOG_FILE"
    sleep 2
    attempt=$((attempt + 1))
done
echo "✅ Docker is ready!"
echo "[$(date)] Docker is ready" >> "$LOG_FILE"

# Check if Superset containers are already running
if docker ps | grep -q "superset"; then
    echo "✅ Superset containers are already running!"
    echo ""
    echo "🌐 To access Superset:"
    echo "   1. Click the 'Ports' tab at the bottom of VS Code"
    echo "   2. Find port 9001 and click the globe icon to open"
    echo "   3. Wait 10-20 minutes for initial startup"
    echo ""
    echo "📝 Login credentials: admin/admin"
    exit 0
fi

# Clean up any existing containers
echo "🧹 Cleaning up existing containers..."
docker-compose -f docker-compose-light.yml down

# Start services
echo "🏗️ Starting Superset in background (daemon mode)..."
echo ""

# Start in detached mode
docker-compose -f docker-compose-light.yml up -d

echo ""
echo "✅ Docker Compose started successfully!"
echo ""
echo "📋 Important information:"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "⏱️ Initial startup takes 10-20 minutes"
echo "🌐 Check the 'Ports' tab for your Superset URL (port 9001)"
echo "👤 Login: admin / admin"
echo ""
echo "📊 Useful commands:"
echo "   docker-compose -f docker-compose-light.yml logs -f  # Follow logs"
echo "   docker-compose -f docker-compose-light.yml ps       # Check status"
echo "   docker-compose -f docker-compose-light.yml down     # Stop services"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
echo "💤 Keeping terminal open for 60 seconds to test persistence..."
sleep 60
echo "✅ Test complete - check if this terminal is still visible!"

# Show final status
docker-compose -f docker-compose-light.yml ps
EXIT_CODE=$?

# If it failed, provide helpful instructions
if [ $EXIT_CODE -ne 0 ] && [ $EXIT_CODE -ne 130 ]; then # 130 is Ctrl+C
    echo ""
    echo "❌ Superset startup failed (exit code: $EXIT_CODE)"
    echo ""
    echo "🔄 To restart Superset, run:"
    echo "   .devcontainer/start-superset.sh"
    echo ""
    echo "🔧 For troubleshooting:"
    echo "   # View logs:"
    echo "   docker-compose -f docker-compose-light.yml logs"
    echo ""
    echo "   # Clean restart (removes volumes):"
    echo "   docker-compose -f docker-compose-light.yml down -v"
    echo "   .devcontainer/start-superset.sh"
    echo ""
    echo "   # Common issues:"
    echo "   - Network timeouts: Just retry, often transient"
    echo "   - Port conflicts: Check 'docker ps'"
    echo "   - Database issues: Try clean restart with -v"
fi

# limitations under the License.
#
**/__pycache__/
**/.git
**/.apache_superset.egg-info
**/.github
**/.mypy_cache
**/.pytest_cache
**/.tox
**/*.pyc
**/*.sqllite
**/*.swp
**/.terser-plugin-cache/
**/node_modules/

tests/
docs/
install/
superset-frontend/cypress-base/
superset-frontend/node_modules/
superset-frontend/cypress/
superset-frontend/coverage/
superset-frontend/.temp_cache/
superset/static/assets/
superset-websocket/dist/
venv
.venv

@@ -15,4 +15,4 @@
# limitations under the License.
#
FLASK_APP="superset.app:create_app()"
FLASK_DEBUG=true
FLASK_ENV="development"
4 .gitattributes vendored
@@ -1,4 +0,0 @@
docker/**/*.sh text eol=lf
*.svg binary
*.ipynb binary
*.geojson binary
42 .github/CODEOWNERS vendored
@@ -1,42 +0,0 @@
# Notify all committers of DB migration changes, per SIP-59

# https://github.com/apache/superset/issues/13351

/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho @sadpandajoe

# Notify some committers of changes in the components

/superset-frontend/src/components/Select/ @michael-s-molina @geido @kgabryje
/superset-frontend/src/components/MetadataBar/ @michael-s-molina @geido @kgabryje
/superset-frontend/src/components/DropdownContainer/ @michael-s-molina @geido @kgabryje

# Notify Helm Chart maintainers about changes in it

/helm/superset/ @craig-rueda @dpgaspar @villebro @nytai @michael-s-molina @mistercrunch @rusackas @Antonio-RiveroMartnez

# Notify E2E test maintainers of changes

/superset-frontend/cypress-base/ @sadpandajoe @geido @eschutho @rusackas @betodealmeida @mistercrunch

# Notify PMC members of changes to GitHub Actions

/.github/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar

# Notify PMC members of changes to required GitHub Actions

/.asf.yaml @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @Antonio-RiveroMartnez

# Maps are a finicky contribution process we care about

**/*.geojson @villebro @rusackas
/superset-frontend/plugins/legacy-plugin-chart-country-map/ @villebro @rusackas

# Notify PMC members of changes to extension-related files

/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-extensions-cli/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/extensions/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/packages/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/extensions/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
99 .github/ISSUE_TEMPLATE/bug-report.yml vendored
@@ -1,99 +0,0 @@
name: Bug report
description: Report a bug to improve Superset's stability
labels: ["bug"]
body:
  - type: markdown
    attributes:
      value: |
        Hello Superset Community member! Please keep things tidy by putting your post in the proper place:

        🚨 Reporting a security issue: send an email to security@superset.apache.org. DO NOT USE GITHUB ISSUES TO REPORT SECURITY PROBLEMS.
        🐛 Reporting a bug: use this form.
        🙏 Asking a question or getting help: post in the [Superset Slack chat](http://bit.ly/join-superset-slack) or [GitHub Discussions](https://github.com/apache/superset/discussions) under "Q&A / Help".
        💡 Requesting a new feature: Search [GitHub Discussions](https://github.com/apache/superset/discussions) to see if it exists already. If not, add a new post there under "Ideas".
  - type: textarea
    id: bug-description
    attributes:
      label: Bug description
      description: A clear description of what the bug is, including reproduction steps and expected behavior.
      placeholder: |
        The bug is that...
        1. Go to '...'
        2. Click on '....'
        3. Scroll down to '....'
        4. See error
    validations:
      required: true
  - type: textarea
    id: screenshots-recordings
    attributes:
      label: Screenshots/recordings
      description: If applicable, add screenshots or recordings to help explain your problem.
  - type: markdown
    attributes:
      value: |
        ### Environment

        Please specify your environment. If your environment does not match the alternatives, you need to upgrade your environment before submitting the issue as it may have already been fixed. For additional information about the releases, see [Release Process](https://github.com/apache/superset/wiki/Release-Process).
  - type: dropdown
    id: superset-version
    attributes:
      label: Superset version
      options:
        - master / latest-dev
        - "5.0.0"
        - "4.1.3"
    validations:
      required: true
  - type: dropdown
    id: python-version
    attributes:
      label: Python version
      options:
        - "3.9"
        - "3.10"
        - "3.11"
        - Not applicable
        - I don't know
    validations:
      required: true
  - type: dropdown
    id: node-version
    attributes:
      label: Node version
      options:
        - "16"
        - "17"
        - "18 or greater"
        - Not applicable
        - I don't know
    validations:
      required: true
  - type: dropdown
    id: browser
    attributes:
      label: Browser
      options:
        - Chrome
        - Firefox
        - Safari
        - Not applicable
    validations:
      required: true
  - type: textarea
    id: additional-context
    attributes:
      label: Additional context
      description: |
        Add any other context about the problem here such as the feature flags that you have enabled, any customizations you have made, the data source you are querying, etc.
  - type: checkboxes
    id: checklist
    attributes:
      label: Checklist
      description: Make sure to follow these steps before submitting your issue - thank you!
      options:
        - label: I have searched Superset docs and Slack and didn't find a solution to my problem.
        - label: I have searched the GitHub issue tracker and didn't find a similar bug report.
        - label: I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.
    validations:
      required: true
47 .github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
@@ -0,0 +1,47 @@
---
name: Bug report
about: Create a report to help us improve
labels: "#bug"

---

A clear and concise description of what the bug is.

### Expected results

what you expected to happen.

### Actual results

what actually happens.

#### Screenshots

If applicable, add screenshots to help explain your problem.

#### How to reproduce the bug

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

### Environment

(please complete the following information):

- superset version: `superset version`
- python version: `python --version`
- node.js version: `node -v`

### Checklist

Make sure to follow these steps before submitting your issue - thank you!

- [ ] I have checked the superset logs for python stacktraces and included it here as text if there are any.
- [ ] I have reproduced the issue with at least the latest released version of superset.
- [ ] I have checked the issue tracker for the same issue and I haven't found one similar.

### Additional context

Add any other context about the problem here.
12 .github/ISSUE_TEMPLATE/config.yml vendored
@@ -1,12 +0,0 @@
---
blank_issues_enabled: false
contact_links:
  - name: Feature Request
    url: https://github.com/apache/superset/discussions/new?category=ideas
    about: Propose a feature request to the Superset community
  - name: Q&A
    url: https://github.com/apache/superset/discussions/new?category=q-a-help
    about: Open a community Q&A thread on GitHub Discussions
  - name: Slack
    url: https://bit.ly/join-superset-slack
    about: Join the Superset Community on Slack for other discussions and assistance
1 .github/ISSUE_TEMPLATE/cosmetic.md vendored
@@ -2,6 +2,7 @@
name: Cosmetic Issue
about: Describe a cosmetic issue with CSS, positioning, layout, labeling, or similar
labels: "cosmetic-issue"

---

## Screenshot

18 .github/ISSUE_TEMPLATE/feature_request.md vendored Normal file
@@ -0,0 +1,18 @@
---
name: Feature request
about: Suggest an idea for this project
labels: "#enhancement"

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
12 .github/ISSUE_TEMPLATE/security_vulnerability.md vendored Normal file
@@ -0,0 +1,12 @@
---
name: Security vulnerability
about: Report a security vulnerability or issue
labels: "#security"

---

## DO NOT REPORT SECURITY VULNERABILITIES HERE

Please report security vulnerabilities to private@superset.apache.org.

In the event a community member discovers a security flaw in Superset, it is important to follow the [Apache Security Guidelines](https://www.apache.org/security/committers.html) and release a fix as quickly as possible before public disclosure. Reporting security vulnerabilities through the usual GitHub Issues channel is not ideal as it will publicize the flaw before a fix can be applied.
11 .github/ISSUE_TEMPLATE/sip.md vendored
@@ -1,15 +1,14 @@
---
name: SIP
about: "Superset Improvement Proposal. See SIP-0 (https://github.com/apache/superset/issues/5602) for details. A SIP introduces any major change into Apache Superset's code or process."
labels: sip
title: "[SIP] Your Title Here (do not add SIP number)"
assignees: "apache/superset-committers"
about: Superset Improvement Proposal
labels: "#SIP"

---

*Please make sure you are familiar with the SIP process documented*
[here](https://github.com/apache/superset/issues/5602). The SIP will be numbered by a committer upon acceptance.
(here)[https://github.com/apache/superset/issues/5602]

## [SIP] Proposal for ...<title>
## [SIP] Proposal for XXX

### Motivation

17 .github/PULL_REQUEST_TEMPLATE.md vendored
@@ -1,27 +1,18 @@
<!---
Please write the PR title following the conventions at https://www.conventionalcommits.org/en/v1.0.0/
Example:
fix(dashboard): load charts correctly
-->

### SUMMARY
<!--- Describe the change below, including rationale and design decisions -->

### BEFORE/AFTER SCREENSHOTS OR ANIMATED GIF
<!--- Skip this if not applicable -->

### TESTING INSTRUCTIONS
<!--- Required! What steps can be taken to manually verify the changes? -->
### TEST PLAN
<!--- What steps should be taken to verify the changes -->

### ADDITIONAL INFORMATION
<!--- Check any relevant boxes with "x" -->
<!--- HINT: Include "Fixes #nnn" if you are fixing an existing issue -->
- [ ] Has associated issue:
- [ ] Required feature flags:
- [ ] Changes UI
- [ ] Includes DB Migration (follow approval process in [SIP-59](https://github.com/apache/superset/issues/13351))
  - [ ] Migration is atomic, supports rollback & is backwards-compatible
  - [ ] Confirm DB migration upgrade and downgrade tested
  - [ ] Runtime estimates and downtime expectations provided
- [ ] Requires DB Migration.
- [ ] Confirm DB Migration upgrade and downgrade tested.
- [ ] Introduces new feature or API
- [ ] Removes existing feature or API

38 .github/SECURITY.md vendored
@@ -1,38 +0,0 @@
# Security Policy

This is a project of the [Apache Software Foundation](https://apache.org) and follows the
ASF [vulnerability handling process](https://apache.org/security/#vulnerability-handling).

## Reporting Vulnerabilities

**⚠️ Please do not file GitHub issues for security vulnerabilities as they are public! ⚠️**

Apache Software Foundation takes a rigorous standpoint in annihilating the security issues
in its software projects. Apache Superset is highly sensitive and forthcoming to issues
pertaining to its features and functionality.
If you have any concern or believe you have found a vulnerability in Apache Superset,
please get in touch with the Apache Superset Security Team privately at
e-mail address [security@superset.apache.org](mailto:security@superset.apache.org).

More details can be found on the ASF website at
[ASF vulnerability reporting process](https://apache.org/security/#reporting-a-vulnerability)

We kindly ask you to include the following information in your report:
- Apache Superset version that you are using
- A sanitized copy of your `superset_config.py` file or any config overrides
- Detailed steps to reproduce the vulnerability

Note that Apache Superset is not responsible for any third-party dependencies that may
have security issues. Any vulnerabilities found in third-party dependencies should be
reported to the maintainers of those projects. Results from security scans of Apache
Superset dependencies found on its official Docker image can be remediated at release time
by extending the image itself.

**Your responsible disclosure and collaboration are invaluable.**

## Extra Information

- [Apache Superset documentation](https://superset.apache.org/docs/security)
- [Common Vulnerabilities and Exposures by release](https://superset.apache.org/docs/security/cves)
- [How Security Vulnerabilities are Reported & Handled in Apache Superset (Blog)](https://preset.io/blog/how-security-vulnerabilities-are-reported-and-handled-in-apache-superset/)
1 .github/actions/cached-dependencies vendored
Submodule .github/actions/cached-dependencies deleted from 064315d61e
1 .github/actions/cached-dependencies/.editorconfig vendored Normal file
@@ -0,0 +1 @@
indent_size = 2
3 .github/actions/cached-dependencies/.eslintignore vendored Normal file
@@ -0,0 +1,3 @@
dist/
lib/
node_modules/
26 .github/actions/cached-dependencies/.eslintrc.js vendored Normal file
@@ -0,0 +1,26 @@
module.exports = {
  plugins: ['jest', '@typescript-eslint'],
  extends: ['plugin:jest/all'],
  parser: '@typescript-eslint/parser',
  parserOptions: {
    ecmaVersion: 9,
    sourceType: 'module',
  },
  rules: {
    'eslint-comments/no-use': 'off',
    'import/no-namespace': 'off',
    'no-unused-vars': 'off',
    'no-console': 'off',
    'jest/prefer-expect-assertions': 'off',
    'jest/no-disabled-tests': 'warn',
    'jest/no-focused-tests': 'error',
    'jest/no-identical-title': 'error',
    'jest/prefer-to-have-length': 'warn',
    'jest/valid-expect': 'error',
  },
  env: {
    node: true,
    es6: true,
    'jest/globals': true,
  },
};
34 .github/actions/cached-dependencies/.github/workflows/tests.yml vendored Normal file
@@ -0,0 +1,34 @@
name: Tests
on:
  pull_request:
    paths-ignore:
      - '**.md'
  push:
    branches:
      - master
    paths-ignore:
      - '**.md'
jobs:
  test:
    strategy:
      matrix:
        os: [ubuntu-latest, macOS-latest]
    name: Test on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}

    steps:
      - uses: actions/checkout@v1
      - uses: actions/setup-node@v1
        with:
          node-version: '12.x'
      - name: Install dependencies
        run: npm ci
      - name: Run prettier format check
        run: npm run format-check
      - name: Build
        run: npm run build
      - name: Run tests
        run: npm run test
      - name: Upload code coverage
        run: |
          bash <(curl -s https://codecov.io/bash)
6 .github/actions/cached-dependencies/.gitignore vendored Normal file
@@ -0,0 +1,6 @@
lib
coverage
node_modules

!dist
!dist/cache
3 .github/actions/cached-dependencies/.prettierignore vendored Normal file
@@ -0,0 +1,3 @@
dist/
lib/
node_modules/
11 .github/actions/cached-dependencies/.prettierrc.json vendored Normal file
@@ -0,0 +1,11 @@
{
  "printWidth": 80,
  "tabWidth": 2,
  "useTabs": false,
  "semi": true,
  "singleQuote": true,
  "trailingComma": "all",
  "bracketSpacing": true,
  "arrowParens": "avoid",
  "parser": "typescript"
}
22 .github/actions/cached-dependencies/LICENSE vendored Normal file
@@ -0,0 +1,22 @@

The MIT License (MIT)

Copyright (c) 2018 GitHub, Inc. and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
212 .github/actions/cached-dependencies/README.md vendored Normal file
@@ -0,0 +1,212 @@
# cached-dependencies

[](https://github.com/ktmud/cached-dependencies/actions?query=workflow%3ATests) [](https://codecov.io/gh/ktmud/cached-dependencies)

Enable **multi-layer cache** and **shortcut commands** in any workflows.

Manage multiple cache targets in one step. Use either the built-in cache configs for npm, yarn, and pip, or write your own. Create a bash command library to easily reduce redundancies across workflows. Most useful for building webapps that require multi-stage building processes.

This is your all-in-one action for everything related to setting up dependencies with cache.

## Inputs

- **run**: bash commands to run, allows shortcut commands
- **caches**: path to a JS module that defines cache targets, defaults to `.github/workflows/caches.js`
- **bashlib**: path to a bash script that defines shortcut commands, defaults to `.github/workflows/bashlib.sh`
- **parallel**: whether to run the commands in parallel with node subprocesses

## Examples

The following workflow sets up dependencies for a typical Python web app with both `~/.pip` and `~/.npm` cache configured in one simple step:

```yaml
jobs:
  build_and_test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Install dependencies
        uses: ktmud/cached-dependencies@v1
        with:
          run: |
            npm-install
            npm run build

            pip-install
            python ./bin/manager.py fill_test_data
```

Here we used the predefined `npm-install` and `pip-install` commands to install dependencies with the corresponding caches.

You may also replace `npm-install` with `yarn-install` to install npm packages with `yarn.lock`.

```yaml
- name: Install dependencies
  uses: ktmud/cached-dependencies@v1
  with:
    run: |
      yarn-install
      yarn build

      pip-install
      python ./bin/manager.py fill_test_data
```

See below for more details.

## Usage

### Cache configs

Under the hood, we use [@actions/cache](https://github.com/marketplace/actions/cache) to manage cache storage. But instead of defining only one cache at a time and specifying them in workflow YAMLs, you manage all caches in a separate JS file: `.github/workflows/caches.js`.

Here is [the default configuration](https://github.com/ktmud/cached-dependencies/blob/master/src/cache/caches.ts) for Linux:

```js
module.exports = {
  pip: {
    path: [`${process.env.HOME}/.cache/pip`],
    hashFiles: ['requirements*.txt'],
    keyPrefix: 'pip-',
    restoreKeys: 'pip-',
  },
  npm: {
    path: [`${HOME}/.npm`],
    hashFiles: [
      `package-lock.json`,
      `*/*/package-lock.json`,
      `!node_modules/*/package-lock.json`,
    ],
  },
  yarn: {
    path: [`${HOME}/.npm`],
    // */* is for supporting lerna monorepo with depth=2
    hashFiles: [`yarn.lock`, `*/*/yarn.lock`, `!node_modules/*/yarn.lock`],
  },
}
```

Here `hashFiles` and `keyPrefix` will be used to compute the primary cache key used in [@actions/cache](https://github.com/marketplace/actions/cache). `keyPrefix` will default to `${cacheName}-` and `restoreKeys` will default to `keyPrefix` if not specified.

It is recommended to always use absolute paths in these configs so you can share them across different workflows more easily (in case the action is called from different working directories).

#### Specify when to restore and save

With the predefined `cache-restore` and `cache-save` bash commands, you have full flexibility on when to restore and save cache:

```yaml
steps:
  - uses: actions/checkout@v2
  - uses: ktmud/cached-dependencies@v1
    with:
      run: |
        cache-restore npm
        npm install
        cache-save npm

        cache-restore pip
        pip install -r requirements.txt
        cache-save pip
```

### Shortcut commands

All predefined shortcut commands can be found [here](https://github.com/ktmud/cached-dependencies/blob/master/src/scripts/bashlib.sh). You can also customize them or add new ones in `.github/workflows/bashlib.sh`.

For example, if you want to install additional packages before saving the `pip` cache, simply add this to the `bashlib.sh` file:

```bash
# override the default `pip-install` command
pip-install() {
  cd $GITHUB_WORKSPACE

  cache-restore pip

  echo "::group::pip install"
  pip install -r requirements.txt      # prod requirements
  pip install -r requirements-dev.txt  # dev requirements
  pip install -e ".[postgres,mysql]"   # current package with some extras
  echo "::endgroup::"

  cache-save pip
}
```

### Default setup command

When `run` is not provided:

```yaml
jobs:
  name: Build
  steps:
    - name: Install dependencies
      uses: ktmud/cached-dependencies@v1
```

You must provide a `default-setup-command` in the bashlib. For example,

```bash
default-setup-command() {
  pip-install & npm-install
}
```

This will start installing pip and npm dependencies at the same time.

### Customize config locations

Both config files, `.github/workflows/bashlib.sh` and `.github/workflows/caches.js`, can be placed in other locations:

```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    caches: ${{ github.workspace }}/.github/configs/caches.js
    bashlib: ${{ github.workspace }}/.github/configs/bashlib.sh
```

### Run commands in parallel

When `parallel` is set to `true`, the `run` input will be split into an array of commands and passed to `Promise.all(...)` to execute in parallel. For example,

```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    parallel: true
    run: |
      pip-install
      npm-install
```

is equivalent to

```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    run: |
      pip-install & npm-install
```

If one or more of your commands must spread across multiple lines, you can add a new line between the parallel commands. Each command within a parallel group will still run sequentially.

```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    run: |
      cache-restore pip
      pip install -r requirements*.txt
      # additional pip packages
      pip install package1 package2 package3
      cache-save pip

      npm-install

      cache-restore cypress
      cd cypress/ && npm install
      cache-save cypress
```

## License

This project is released under [the MIT License](LICENSE).
124 .github/actions/cached-dependencies/__tests__/cache.test.ts vendored Normal file
@@ -0,0 +1,124 @@
import path from 'path';
import * as fs from 'fs';
import * as os from 'os';
import * as core from '@actions/core';
import * as cache from '../src/cache';
import * as inputsUtils from '../src/utils/inputs';
import * as actionUtils from '@actions/cache/src/utils/actionUtils';
import defaultCaches from '../src/cache/caches';
import { setInputs, getInput, maybeArrayToString } from '../src/utils/inputs';
import { Inputs, InputName, GitHubEvent, EnvVariable } from '../src/constants';
import caches, { npmExpectedHash } from './fixtures/caches';

describe('patch core states', () => {
  it('should log error if states file invalid', () => {
    const logWarningMock = jest.spyOn(actionUtils, 'logWarning');
    fs.writeFileSync(`${os.tmpdir()}/cached--states.json`, 'INVALID_JSON', {
      encoding: 'utf-8',
    });
    core.getState('haha');
    expect(logWarningMock).toHaveBeenCalledTimes(2);
  });
  it('should persist state', () => {
    core.saveState('test', '100');
    expect(core.getState('test')).toStrictEqual('100');
  });
});

describe('cache runner', () => {
  it('should use default cache config', async () => {
    await cache.loadCustomCacheConfigs();
    // but `npm` actually come from `src/cache/caches.ts`
    const inputs = await cache.getCacheInputs('npm');
    expect(inputs?.[InputName.Path]).toStrictEqual(
      maybeArrayToString(defaultCaches.npm.path),
    );
    expect(inputs?.[InputName.RestoreKeys]).toStrictEqual('npm-');
  });

  it('should override cache config', async () => {
    setInputs({
      [InputName.Caches]: path.resolve(__dirname, 'fixtures/caches'),
    });
    await cache.loadCustomCacheConfigs();

    const inputs = await cache.getCacheInputs('npm');
    expect(inputs?.[InputName.Path]).toStrictEqual(
      maybeArrayToString(caches.npm.path),
    );
    expect(inputs?.[InputName.Key]).toStrictEqual(`npm-${npmExpectedHash}`);
    expect(inputs?.[InputName.RestoreKeys]).toStrictEqual(
      maybeArrayToString(caches.npm.restoreKeys),
    );
  });

  it('should apply inputs and restore cache', async () => {
    setInputs({
      [InputName.Caches]: path.resolve(__dirname, 'fixtures/caches'),
      [EnvVariable.GitHubEventName]: GitHubEvent.PullRequest,
    });

    const setInputsMock = jest.spyOn(inputsUtils, 'setInputs');
    const inputs = await cache.getCacheInputs('npm');
    const result = await cache.run('restore', 'npm');

    expect(result).toBeUndefined();

    // before run
    expect(setInputsMock).toHaveBeenNthCalledWith(1, inputs);

    // after run
    expect(setInputsMock).toHaveBeenNthCalledWith(2, {
      [InputName.Key]: '',
      [InputName.Path]: '',
      [InputName.RestoreKeys]: '',
    });

    // inputs actually restored to original value
    expect(getInput(InputName.Key)).toStrictEqual('');

    // pretend still in execution context
    setInputs(inputs as Inputs);

    // `core.getState` should return the primary key
    expect(core.getState('CACHE_KEY')).toStrictEqual(inputs?.[InputName.Key]);

    setInputsMock.mockRestore();
  });

  it('should run saveCache', async () => {
    // call to save should also work
    const logWarningMock = jest.spyOn(actionUtils, 'logWarning');

    setInputs({
      [InputName.Parallel]: 'true',
    });
    await cache.run('save', 'npm');
    expect(logWarningMock).toHaveBeenCalledWith(
      'Cache Service Url not found, unable to restore cache.',
    );
  });

  it('should exit on invalid args', async () => {
    // other calls do generate errors
    const processExitMock = jest
      .spyOn(process, 'exit')
      // @ts-ignore
      .mockImplementation(() => {});

    // incomplete arguments
    await cache.run();
    await cache.run('save');

    // bad arguments
    await cache.run('save', 'unknown-cache');
    await cache.run('unknown-action', 'unknown-cache');

    setInputs({
      [InputName.Caches]: 'non-existent',
    });
    await cache.run('save', 'npm');

    expect(processExitMock).toHaveBeenCalledTimes(5);
  });
});
5 .github/actions/cached-dependencies/__tests__/fixtures/bashlib.sh (vendored, new file)
@@ -0,0 +1,5 @@
#!/bin/bash

default-setup-command() {
  print-cachescript-path
}
14 .github/actions/cached-dependencies/__tests__/fixtures/caches.ts (vendored, new file)
@@ -0,0 +1,14 @@
/**
 * Example cache config.
 */
export const npmHashFiles = ['.*ignore'];
export const npmExpectedHash =
  '13ed29a1c7ec906e7dcb20626957ebfcd3f0f2174bd2685a012105792bf1ff55';

export default {
  npm: {
    path: [`~/.npm`],
    hashFiles: npmHashFiles,
    restoreKeys: 'node-npm-',
  },
};
101 .github/actions/cached-dependencies/__tests__/setup.test.ts (vendored, new file)
@@ -0,0 +1,101 @@
/**
 * Test default runner.
 */
import { setInputs } from '../src/utils/inputs';
import { InputName, DefaultInputs } from '../src/constants';
import * as setup from '../src/setup';
import path from 'path';

const extraBashlib = path.resolve(__dirname, './fixtures/bashlib.sh');

describe('setup runner', () => {
  // don't actually run the bash script
  const runCommandMock = jest.spyOn(setup, 'runCommand');

  it('should allow custom bashlib', async () => {
    setInputs({
      [InputName.Bashlib]: extraBashlib,
    });
    await setup.run();
    expect(runCommandMock).toHaveBeenCalledTimes(1);
    expect(runCommandMock).toHaveBeenCalledWith(
      DefaultInputs[InputName.Run],
      extraBashlib,
    );
  });

  it('should allow inline bash overrides', async () => {
    const processExitMock = jest
      .spyOn(process, 'exit')
      // @ts-ignore
      .mockImplementation(() => {});

    setInputs({
      [InputName.Bashlib]: '',
      [InputName.Parallel]: 'false',
      [InputName.Run]: `
        ${DefaultInputs[InputName.Run]}() {
          echo "It works!"
          exit 202
        }
        ${DefaultInputs[InputName.Run]}
      `,
    });
    // allow the bash script to run for one test, but override the default
    await setup.run();
    expect(runCommandMock).toHaveBeenCalledTimes(1);
    expect(processExitMock).toHaveBeenCalledTimes(1);
    expect(processExitMock).toHaveBeenCalledWith(1);
  });

  it('should use run commands', async () => {
    // don't run the commands when there are no overrides
    runCommandMock.mockImplementation(async () => {});

    setInputs({
      [InputName.Bashlib]: 'non-existent',
      [InputName.Run]: 'print-cachescript-path',
    });

    await setup.run();

    expect(runCommandMock).toHaveBeenCalledTimes(1);
    expect(runCommandMock).toHaveBeenCalledWith('print-cachescript-path', '');
  });

  it('should handle single-new-line parallel commands', async () => {
    setInputs({
      [InputName.Run]: `
        test-command-1
        test-command-2
      `,
      [InputName.Parallel]: 'true',
    });

    await setup.run();

    expect(runCommandMock).toHaveBeenNthCalledWith(1, 'test-command-1', '');
    expect(runCommandMock).toHaveBeenNthCalledWith(2, 'test-command-2', '');
  });

  it('should handle multi-new-line parallel commands', async () => {
    setInputs({
      [InputName.Run]: `
  test-1-1
  test-1-2

  test-2
      `,
      [InputName.Parallel]: 'true',
    });

    await setup.run();

    expect(runCommandMock).toHaveBeenNthCalledWith(
      1,
      'test-1-1\n  test-1-2',
      '',
    );
    expect(runCommandMock).toHaveBeenNthCalledWith(2, 'test-2', '');
  });
});
10 .github/actions/cached-dependencies/__tests__/tsconfig.json (vendored, new file)
@@ -0,0 +1,10 @@
{
  "extends": "../tsconfig.json",
  "compilerOptions": {
    "baseUrl": "./",
    "outDir": "../build",
    "noEmit": true,
    "rootDir": "../"
  },
  "exclude": ["node_modules"]
}
25 .github/actions/cached-dependencies/action.yml (vendored, new file)
@@ -0,0 +1,25 @@
name: Cached Dependencies
description: Setup multi-layered cache and dependencies in one step, share predefined commands across workflows
author: Jesse Yang <hello@yjc.me>
branding:
  icon: layers
  color: yellow
inputs:
  caches:
    required: false
    description: Path to a JS file with cache configs
    default: ${{ github.workspace }}/.github/workflows/caches.js
  bashlib:
    required: false
    description: Path to a Bash script with command shortcuts
    default: ${{ github.workspace }}/.github/workflows/bashlib.sh
  run:
    required: false
    description: Setup commands to run, can use shortcuts defined in bashlib
    default: default-setup-command
  parallel:
    required: false
    description: Whether to run commands in parallel
runs:
  using: node12
  main: dist/index.js
1757 .github/actions/cached-dependencies/dist/index.js (vendored, new file)
File diff suppressed because it is too large
6125 .github/actions/cached-dependencies/dist/scripts/cache/index.js (vendored, new file)
File diff suppressed because it is too large
57 .github/actions/cached-dependencies/dist/scripts/cache/thread.js (vendored, new file)
@@ -0,0 +1,57 @@
'use strict';
const fs = require('fs');
const crypto = require('crypto');
const {parentPort} = require('worker_threads');

const handlers = {
  hashFile: (algorithm, filePath) => new Promise((resolve, reject) => {
    const hasher = crypto.createHash(algorithm);
    fs.createReadStream(filePath)
      // TODO: Use `Stream.pipeline` when targeting Node.js 12.
      .on('error', reject)
      .pipe(hasher)
      .on('error', reject)
      .on('finish', () => {
        const {buffer} = hasher.read();
        resolve({value: buffer, transferList: [buffer]});
      });
  }),
  hash: async (algorithm, input) => {
    const hasher = crypto.createHash(algorithm);

    if (Array.isArray(input)) {
      for (const part of input) {
        hasher.update(part);
      }
    } else {
      hasher.update(input);
    }

    const hash = hasher.digest().buffer;
    return {value: hash, transferList: [hash]};
  }
};

parentPort.on('message', async message => {
  try {
    const {method, args} = message;
    const handler = handlers[method];

    if (handler === undefined) {
      throw new Error(`Unknown method '${method}'`);
    }

    const {value, transferList} = await handler(...args);
    parentPort.postMessage({id: message.id, value}, transferList);
  } catch (error) {
    const newError = {message: error.message, stack: error.stack};

    for (const [key, value] of Object.entries(error)) {
      if (typeof value !== 'object') {
        newError[key] = value;
      }
    }

    parentPort.postMessage({id: message.id, error: newError});
  }
});
21 .github/actions/cached-dependencies/jest.config.js (vendored, new file)
@@ -0,0 +1,21 @@
module.exports = {
  clearMocks: true,
  moduleFileExtensions: ['js', 'ts'],
  testEnvironment: 'node',
  testMatch: ['**/*.test.ts'],
  transform: {
    '^.+\\.ts$': 'ts-jest',
  },
  transformIgnorePatterns: [
    '/node_modules/(?!@actions).+\\.js$',
  ],
  verbose: true,
};

// suppress debug messages
const processStdoutWrite = process.stdout.write.bind(process.stdout);
process.stdout.write = (str, encoding, cb) => {
  processStdoutWrite(str.split('\n').filter(x => {
    return !/^::debug::/.test(x);
  }).join('\n'), encoding, cb);
};
8197 .github/actions/cached-dependencies/package-lock.json (generated, vendored, new file)
File diff suppressed because it is too large
47 .github/actions/cached-dependencies/package.json (vendored, new file)
@@ -0,0 +1,47 @@
{
  "name": "setup-superset-action",
  "version": "1.0.0",
  "private": true,
  "keywords": [
    "actions",
    "node",
    "setup",
    "superset"
  ],
  "main": "dist/run",
  "scripts": {
    "all": "npm run format && npm run lint && npm run test && npm run build",
    "build": "npm run clean && tsc && ncc build -o dist src/run.ts && ncc build -o dist/scripts/cache src/scripts/cache.ts",
    "clean": "rm -rf ./lib ./dist",
    "coverage": "npm run test && open ./coverage/lcov-report/index.html",
    "format": "prettier --write **/*.ts",
    "format-check": "prettier --check **/*.ts",
    "lint": "eslint src/**/*.ts",
    "test": "jest --clearCache && jest --coverage"
  },
  "dependencies": {
    "@actions/cache": "actions/cache#d29c1df198dd38ac88e0ae23a2881b99c2d20e68",
    "@actions/core": "1.2.4",
    "@actions/exec": "1.0.4",
    "@actions/glob": "0.1.0",
    "@types/uuid": "7.0.4",
    "hasha": "5.2.0",
    "tempy": "0.6.0",
    "uuid": "7.0.3"
  },
  "devDependencies": {
    "@types/jest": "26.0.7",
    "@types/node": "12.12.53",
    "@typescript-eslint/eslint-plugin": "3.7.1",
    "@typescript-eslint/parser": "3.7.1",
    "@zeit/ncc": "0.22.3",
    "eslint": "7.5.0",
    "eslint-plugin-jest": "23.19.0",
    "jest": "26.1.0",
    "js-yaml": "3.14.0",
    "prettier": "2.0.5",
    "prettier-plugin-packagejson": "2.2.5",
    "ts-jest": "26.1.4",
    "typescript": "3.9.7"
  }
}
5 .github/actions/cached-dependencies/renovate.json (vendored, new file)
@@ -0,0 +1,5 @@
{
  "extends": [
    "config:base"
  ]
}
49 .github/actions/cached-dependencies/src/cache/caches.ts (vendored, new file)
@@ -0,0 +1,49 @@
/**
 * Default cache configs
 */
import * as os from 'os';

export interface CacheConfig {
  path: string[] | string;
  hashFiles: string[] | string;
  keyPrefix?: string;
  restoreKeys?: string[] | string;
}

export interface CacheConfigs {
  [cacheName: string]: CacheConfig;
}

const { HOME = '~' } = process.env;
const platform = os.platform() as 'linux' | 'darwin' | 'win32';
const pathByPlatform = {
  linux: {
    pip: `${HOME}/.cache/pip`,
  },
  darwin: {
    pip: `${HOME}/Library/Caches/pip`,
  },
  win32: {
    pip: `${HOME}\\AppData\\Local\\pip\\Cache`,
  },
};

export default {
  pip: {
    path: pathByPlatform[platform].pip,
    hashFiles: 'requirements*.txt',
  },
  npm: {
    path: `${HOME}/.npm`,
    hashFiles: [
      `package-lock.json`,
      // support lerna monorepo with depth=2
      `*/*/package-lock.json`,
      `!node_modules/*/package-lock.json`,
    ],
  },
  yarn: {
    path: `${HOME}/.npm`,
    hashFiles: [`yarn.lock`, `*/*/yarn.lock`, `!node_modules/*/yarn.lock`],
  },
} as CacheConfigs;
146 .github/actions/cached-dependencies/src/cache/index.ts (vendored, new file)
@@ -0,0 +1,146 @@
/**
 * Execute @actions/cache with predefined cache configs.
 */
import { beginImport, doneImport } from './patch'; // monkey patch @actions modules

beginImport();
import saveCache from '@actions/cache/src/save';
import restoreCache from '@actions/cache/src/restore';
doneImport();

import hasha from 'hasha';
import * as fs from 'fs';
import * as core from '@actions/core';
import * as glob from '@actions/glob';
import { Inputs, InputName, DefaultInputs } from '../constants';
import { applyInputs, getInput, maybeArrayToString } from '../utils/inputs';
import caches from './caches'; // default cache configs

// GitHub uses `sha256` for the built-in `${{ hashFiles(...) }}` expression
// https://help.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#hashfiles
const HASH_OPTION = { algorithm: 'sha256' };

/**
 * Load custom cache configs from the `caches` path defined in inputs.
 *
 * @returns Whether the loading is successful.
 */
export async function loadCustomCacheConfigs() {
  const customCachePath = getInput(InputName.Caches);
  try {
    core.debug(`Reading cache configs from '${customCachePath}'`);
    const customCache = await import(customCachePath);
    Object.assign(caches, customCache.default);
  } catch (error) {
    if (
      customCachePath !== DefaultInputs[InputName.Caches] ||
      !error.message.includes('Cannot find module')
    ) {
      core.error(error.message);
      core.setFailed(
        `Failed to load custom cache configs: '${customCachePath}'`,
      );
      return process.exit(1);
    }
  }
  return true;
}

/**
 * Generate a SHA256 hash for a list of files matched by glob patterns.
 *
 * @param {string[]} patterns - The glob patterns.
 * @param {string} extra - An extra string appended to the file hashes to
 *                         compute the final hash.
 */
export async function hashFiles(
  patterns: string[] | string,
  extra: string = '',
) {
  const globber = await glob.create(maybeArrayToString(patterns));
  let hash = '';
  let counter = 0;
  for await (const file of globber.globGenerator()) {
    if (!fs.statSync(file).isDirectory()) {
      hash += hasha.fromFileSync(file, HASH_OPTION);
      counter += 1;
    }
  }
  core.debug(`Computed hash for ${counter} files. Pattern: ${patterns}`);
  return hasha(hash + extra, HASH_OPTION);
}

/**
 * Generate GitHub Action inputs based on a predefined cache config. Will be
 * used to override env variables.
 *
 * @param {string} cacheName - Name of the predefined cache config.
 */
export async function getCacheInputs(
  cacheName: string,
): Promise<Inputs | null> {
  if (!(cacheName in caches)) {
    return null;
  }
  const { keyPrefix, restoreKeys, path, hashFiles: patterns } = caches[
    cacheName
  ];
  const pathString = maybeArrayToString(path);
  const prefix = keyPrefix || `${cacheName}-`;
  // include `path` in the hash, too, so as to bust caches in case users
  // change the path definition.
  const hash = await hashFiles(patterns, pathString);
  return {
    [InputName.Key]: `${prefix}${hash}`,
    [InputName.Path]: pathString,
    // only use prefix as restore key if it is never defined
    [InputName.RestoreKeys]:
      restoreKeys === undefined ? prefix : maybeArrayToString(restoreKeys),
  };
}

export const actions = {
  restore(inputs: Inputs) {
    return applyInputs(inputs, restoreCache);
  },
  save(inputs: Inputs) {
    return applyInputs(inputs, saveCache);
  },
};

export type ActionChoice = keyof typeof actions;

export async function run(
  action: string | undefined = undefined,
  cacheName: string | undefined = undefined,
) {
  if (!action || !(action in actions)) {
    core.setFailed(`Choose a cache action from: [restore, save]`);
    return process.exit(1);
  }
  if (!cacheName) {
    core.setFailed(`Must provide a cache name.`);
    return process.exit(1);
  }

  const runInParallel = getInput(InputName.Parallel);

  if (await loadCustomCacheConfigs()) {
    if (runInParallel) {
      core.info(`${action.toUpperCase()} cache for ${cacheName}`);
    } else {
      core.startGroup(`${action.toUpperCase()} cache for ${cacheName}`);
    }
    const inputs = await getCacheInputs(cacheName);
    if (inputs) {
      core.info(JSON.stringify(inputs, null, 2));
      await actions[action as ActionChoice](inputs);
    } else {
      core.setFailed(`Cache '${cacheName}' not defined, failed to ${action}.`);
      return process.exit(1);
    }
    if (!runInParallel) {
      core.endGroup();
    }
  }
}
95 .github/actions/cached-dependencies/src/cache/patch.ts (vendored, new file)
@@ -0,0 +1,95 @@
/**
 * Monkey patch to safely import and use @actions/cache modules
 */
import * as utils from '@actions/cache/src/utils/actionUtils';
import * as core from '@actions/core';
import * as fs from 'fs';
import * as os from 'os';
import { InputName } from '../constants';
import { getInput } from '../utils/inputs';

interface KeyValueStore {
  [key: string]: any;
}

const { logWarning, isValidEvent } = utils;
const { getState, saveState } = core;

function getStateStoreFile() {
  const cacheName = getInput(InputName.Key);
  return `${os.tmpdir()}/cached-${cacheName}-states.json`;
}

/**
 * Load states from the persistent store.
 *
 * The default `core.saveState` only writes states as command output, and
 * `core.getState` can only read the state in a later step via env
 * variables.
 *
 * So we use a temp file to save and load states, to allow persistent
 * states within the same step.
 *
 * Since the state output is not unique to caches, each cache should have its
 * own file for persistent states.
 */
function loadStates() {
  const stateStore = getStateStoreFile();
  const states: KeyValueStore = {};
  try {
    Object.assign(
      states,
      JSON.parse(fs.readFileSync(stateStore, { encoding: 'utf-8' })),
    );
    core.debug(`Loaded states from: ${stateStore}`);
  } catch (error) {
    // pass
    if (error.code !== 'ENOENT') {
      utils.logWarning(`Could not load states: ${stateStore}`);
      utils.logWarning(error.message);
    }
  }
  return states;
}

/**
 * Save states to the persistent storage.
 */
function persistState(name: string, value: any) {
  const states = loadStates();
  const stateStore = getStateStoreFile();
  const valueString = typeof value === 'string' ? value : JSON.stringify(value);

  // make sure value is always a string
  states[name] = valueString;

  // persist state in the temp file
  fs.writeFileSync(stateStore, JSON.stringify(states, null, 2), {
    encoding: 'utf-8',
  });
  core.debug(`Persist state "${name}=${valueString}" to ${stateStore}`);

  // still pass the original value to the original function, though
  return saveState(name, value);
}

/**
 * Get states from the persistent store, falling back to "official" states.
 */
function obtainState(name: string) {
  const states = loadStates();
  return states[name] || getState(name);
}

export function beginImport() {
  Object.defineProperty(utils, 'isValidEvent', { value: () => false });
  Object.defineProperty(utils, 'logWarning', { value: () => {} });
}

export function doneImport() {
  Object.defineProperty(utils, 'isValidEvent', { value: isValidEvent });
  Object.defineProperty(utils, 'logWarning', { value: logWarning });

  Object.defineProperty(core, 'saveState', { value: persistState });
  Object.defineProperty(core, 'getState', { value: obtainState });
}
43 .github/actions/cached-dependencies/src/constants.ts (vendored, new file)
@@ -0,0 +1,43 @@
// Possible input names
export enum InputName {
  // @actions/cache specific inputs
  Key = 'key',
  Path = 'path',
  RestoreKeys = 'restore-keys',

  // setup-webapp specific inputs
  Run = 'run',
  Caches = 'caches',
  Bashlib = 'bashlib',
  Parallel = 'parallel',
}

// Possible GitHub event names
export enum GitHubEvent {
  Push = 'push',
  PullRequest = 'pull_request',
}

// Directly available environment variables
export enum EnvVariable {
  GitHubEventName = 'GITHUB_EVENT_NAME',
}

export const EnvVariableNames = new Set(Object.values(EnvVariable) as string[]);

export interface Inputs {
  [EnvVariable.GitHubEventName]?: string;
  [InputName.Key]?: string;
  [InputName.RestoreKeys]?: string;
  [InputName.Path]?: string;
  [InputName.Caches]?: string;
  [InputName.Bashlib]?: string;
  [InputName.Run]?: string;
  [InputName.Parallel]?: string;
}

export const DefaultInputs = {
  [InputName.Caches]: '.github/workflows/caches.js',
  [InputName.Bashlib]: '.github/workflows/bashlib.sh',
  [InputName.Run]: 'default-setup-command',
} as Inputs;
3 .github/actions/cached-dependencies/src/run.ts (vendored, new file)
@@ -0,0 +1,3 @@
import { run } from './setup';

run();
61 .github/actions/cached-dependencies/src/scripts/bashlib.sh (vendored, new file)
@@ -0,0 +1,61 @@
#!/bin/bash
# -----------------------------------------------
# Predefined command shortcuts
# -----------------------------------------------

# Exit if any command fails
set -e

bashSource=${BASH_SOURCE[${#BASH_SOURCE[@]} - 1]:-${(%):-%x}}
cacheScript="$(dirname $(dirname $(dirname $bashSource)))/dist/scripts/cache"

print-cachescript-path() {
  echo $cacheScript
}

cache-restore() {
  node $cacheScript restore $1
}

cache-save() {
  node $cacheScript save $1
}

# install python packages
pip-install() {
  cache-restore pip
  echo "::group::Install Python packages"
  pip install -r requirements.txt  # install dependencies
  pip install -e .  # install current directory as an editable python package
  echo "::endgroup::"
  cache-save pip
}

# install npm packages
npm-install() {
  cache-restore npm
  echo "::group::Install npm packages"
  echo "npm: $(npm --version)"
  echo "node: $(node --version)"
  npm ci
  echo "::endgroup::"
  cache-save npm
}

# install npm packages via yarn
yarn-install() {
  cache-restore yarn
  echo "::group::Install npm packages via yarn"
  echo "npm: $(npm --version)"
  echo "node: $(node --version)"
  echo "yarn: $(yarn --version)"
  yarn
  echo "::endgroup::"
  cache-save yarn
}

# the default setup command; override it to e.g. install both pip and npm
# packages at the same time
default-setup-command() {
  echo 'Please provide `run` commands or configure `default-setup-command`.'
  exit 1
}
18 .github/actions/cached-dependencies/src/scripts/cache.ts (vendored, new file)
@@ -0,0 +1,18 @@
/**
 * Runner script to restore/save caches by predefined configs.
 * Used in `scripts/bashlib.sh`.
 */
import { EnvVariable } from '../constants';

// To import `@actions/cache` modules safely, we must set the GitHub event
// name to an invalid value, so the actual runner code doesn't execute.
const originalEvent = process.env[EnvVariable.GitHubEventName];
process.env[EnvVariable.GitHubEventName] = 'CACHE_HACK';

import { run } from '../cache';

// then we restore the event name before the job actually runs
process.env[EnvVariable.GitHubEventName] = originalEvent;

// @ts-ignore
run(...process.argv.slice(2));
66 .github/actions/cached-dependencies/src/setup.ts (vendored, new file)
@@ -0,0 +1,66 @@
/**
 * Load inputs and execute.
 */
import * as core from '@actions/core';
import { exec } from '@actions/exec';
import path from 'path';
import fs from 'fs';
import { DefaultInputs, InputName } from './constants';
import { getInput } from './utils/inputs';

const SHARED_BASHLIB = path.resolve(__dirname, '../src/scripts/bashlib.sh');

/**
 * Run bash commands with predefined lib functions.
 *
 * @param {string} cmd - The bash commands to execute.
 */
export async function runCommand(
  cmd: string,
  extraBashlib: string,
): Promise<void> {
  const bashlibCommands = [`source ${SHARED_BASHLIB}`];
  if (extraBashlib) {
    bashlibCommands.push(`source ${extraBashlib}`);
  }
  try {
    await exec('bash', ['-c', [...bashlibCommands, cmd].join('\n  ')]);
  } catch (error) {
    core.setFailed(error.message);
    process.exit(1);
  }
}

export async function run(): Promise<void> {
  let bashlib = getInput(InputName.Bashlib);
  const rawCommands = getInput(InputName.Run);
  const runInParallel = getInput(InputName.Parallel);

  if (!fs.existsSync(bashlib)) {
    if (bashlib !== DefaultInputs[InputName.Bashlib]) {
      core.error(`Custom bashlib "${bashlib}" does not exist.`);
    }
    // don't add bashlib to runCommand
    bashlib = '';
  }

  if (runInParallel) {
    // Attempt to split by two or more new lines first; if there is still
    // only one command, attempt to split by one new line. This is because
    // users asked for parallelization, so we make our best effort to get
    // multiple commands.
    let commands = rawCommands.split(/\n{2,}/);
    if (commands.length === 1) {
      commands = rawCommands.split('\n');
    }
    core.debug(`>> Run ${commands.length} commands in parallel...`);
    await Promise.all(
      commands
        .map(x => x.trim())
        .filter(x => !!x)
        .map(cmd => exports.runCommand(cmd, bashlib)),
    );
  } else if (rawCommands) {
    await exports.runCommand(rawCommands, bashlib);
  }
}
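The parallel-splitting heuristic in `setup.run` (prefer blank-line-separated groups, fall back to one command per line) can be sketched as a standalone function (plain JS; the name `splitCommands` is illustrative, not an export of the action):

```javascript
// Split a `run` input into parallel command groups: blank lines separate
// groups, so multi-line groups still run sequentially inside one bash
// invocation. If no blank line is found, fall back to splitting on every
// newline, then drop empty entries.
function splitCommands(rawCommands) {
  let commands = rawCommands.split(/\n{2,}/);
  if (commands.length === 1) {
    commands = rawCommands.split('\n');
  }
  return commands.map(x => x.trim()).filter(x => !!x);
}
```

This matches the test expectations above: a single-newline input yields one command per line, while a blank line keeps `test-1-1` and `test-1-2` together as one group.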
2 .github/actions/cached-dependencies/src/types/external.d.ts (vendored, new file)
@@ -0,0 +1,2 @@
declare module '@actions/cache/dist/restore';
declare module '@actions/cache/dist/save';
61 .github/actions/cached-dependencies/src/utils/inputs.ts (vendored, new file)
@@ -0,0 +1,61 @@
/**
 * Manage inputs and env variables.
 */
import * as core from '@actions/core';
import {
  Inputs,
  EnvVariableNames,
  InputName,
  DefaultInputs,
} from '../constants';

export function getInput(name: keyof Inputs): string {
  const value = core.getInput(name);
  if (name === InputName.Parallel) {
    return value.toUpperCase() === 'TRUE' ? value : '';
  }
  return value || DefaultInputs[name] || '';
}

/**
 * Update env variables associated with some inputs.
 * See: https://github.com/actions/toolkit/blob/5b940ebda7e7b86545fe9741903c930bc1191eb0/packages/core/src/core.ts#L69-L77 .
 *
 * @param {Inputs} inputs - The new inputs to apply to the env variables.
 */
export function setInputs(inputs: Inputs): void {
  for (const [name, value] of Object.entries(inputs)) {
    const envName = EnvVariableNames.has(name)
      ? name
      : `INPUT_${name.replace(/ /g, '_').toUpperCase()}`;
    process.env[envName] = value;
  }
}

/**
 * Apply new inputs and execute a runner function, restoring them when done.
 *
 * @param {Inputs} inputs - The new inputs to apply to the env variables before
 *                          executing the runner.
 * @param {runner} runner - The runner function that returns a promise.
 * @returns {Promise<any>} - The result from the runner function.
 */
export async function applyInputs(
  inputs: Inputs,
  runner: () => Promise<void>,
): Promise<any> {
  const originalInputs: Inputs = Object.fromEntries(
    Object.keys(inputs).map(name => [
      name,
      EnvVariableNames.has(name) ? process.env[name] : core.getInput(name),
    ]),
  );
  exports.setInputs(inputs);
  const result = await runner();
  exports.setInputs(originalInputs);
  return result;
}

export function maybeArrayToString(input: string[] | string) {
  return Array.isArray(input) ? input.join('\n') : input;
}
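`setInputs` works because `@actions/core` reads each input from an env variable named `INPUT_` plus the upper-cased input name (the convention linked in the doc comment above). A minimal sketch of that mapping together with `maybeArrayToString` (plain JS; `toEnvName` is an illustrative helper name, not part of the action):

```javascript
// Map an action input name to the env variable @actions/core reads it
// from, e.g. 'restore-keys' -> 'INPUT_RESTORE-KEYS'. Spaces become
// underscores; hyphens are kept as-is.
function toEnvName(name) {
  return `INPUT_${name.replace(/ /g, '_').toUpperCase()}`;
}

// Join array-valued cache-config entries (paths, restore keys) into the
// newline-separated strings the @actions/cache inputs expect.
function maybeArrayToString(input) {
  return Array.isArray(input) ? input.join('\n') : input;
}
```

Writing `process.env[toEnvName('key')] = 'npm-abc123'` is therefore equivalent to the `with: key: npm-abc123` an action would receive from a workflow file.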
19 .github/actions/cached-dependencies/tsconfig.json (vendored, new file)
@@ -0,0 +1,19 @@
{
  "compilerOptions": {
    "target": "es6",
    "module": "commonjs",
    "lib": ["esnext"],
    "moduleResolution": "node",
    "outDir": "./lib",
    "rootDir": ".",
    "strict": true,
    "noImplicitAny": true,
    "esModuleInterop": true,
    "preserveSymlinks": true
  },
  "include": [
    "./src",
    "./node_modules/@actions"
  ],
  "exclude": ["**/*.test.ts", "__tests__"]
}
3 .github/actions/cancel-workflow-runs/.eslintignore (vendored, new file)
@@ -0,0 +1,3 @@
dist/
lib/
node_modules/
58 .github/actions/cancel-workflow-runs/.eslintrc.json (vendored, new file)
@@ -0,0 +1,58 @@
{
  "plugins": ["jest", "@typescript-eslint"],
  "extends": ["plugin:github/es6"],
  "parser": "@typescript-eslint/parser",
  "parserOptions": {
    "ecmaVersion": 9,
    "sourceType": "module",
    "project": "./tsconfig.json"
  },
  "rules": {
    "eslint-comments/no-use": "off",
    "import/no-namespace": "off",
    "no-unused-vars": "off",
    "@typescript-eslint/no-unused-vars": "error",
    "@typescript-eslint/explicit-member-accessibility": ["error", {"accessibility": "no-public"}],
    "@typescript-eslint/no-require-imports": "error",
    "@typescript-eslint/array-type": "error",
    "@typescript-eslint/await-thenable": "error",
    "@typescript-eslint/ban-ts-ignore": "error",
    "camelcase": "off",
    "@typescript-eslint/camelcase": "error",
    "@typescript-eslint/class-name-casing": "error",
    "@typescript-eslint/explicit-function-return-type": ["error", {"allowExpressions": true}],
    "@typescript-eslint/func-call-spacing": ["error", "never"],
    "@typescript-eslint/generic-type-naming": ["error", "^[A-Z][A-Za-z]*$"],
    "@typescript-eslint/no-array-constructor": "error",
    "@typescript-eslint/no-empty-interface": "error",
    "@typescript-eslint/no-explicit-any": "error",
    "@typescript-eslint/no-extraneous-class": "error",
    "@typescript-eslint/no-for-in-array": "error",
    "@typescript-eslint/no-inferrable-types": "error",
    "@typescript-eslint/no-misused-new": "error",
    "@typescript-eslint/no-namespace": "error",
    "@typescript-eslint/no-non-null-assertion": "warn",
    "@typescript-eslint/no-object-literal-type-assertion": "error",
    "@typescript-eslint/no-unnecessary-qualifier": "error",
    "@typescript-eslint/no-unnecessary-type-assertion": "error",
    "@typescript-eslint/no-useless-constructor": "error",
    "@typescript-eslint/no-var-requires": "error",
    "@typescript-eslint/prefer-for-of": "warn",
    "@typescript-eslint/prefer-function-type": "warn",
    "@typescript-eslint/prefer-includes": "error",
    "@typescript-eslint/prefer-interface": "error",
    "@typescript-eslint/prefer-string-starts-ends-with": "error",
    "@typescript-eslint/promise-function-async": "error",
    "@typescript-eslint/require-array-sort-compare": "error",
    "@typescript-eslint/restrict-plus-operands": "error",
    "semi": "off",
    "@typescript-eslint/semi": ["error", "never"],
    "@typescript-eslint/type-annotation-spacing": "error",
    "@typescript-eslint/unbound-method": "error"
  },
  "env": {
    "node": true,
    "es6": true,
    "jest/globals": true
  }
}
36 .github/actions/cancel-workflow-runs/.github/workflows/test.yml (vendored, new file)
@@ -0,0 +1,36 @@
name: "Test the build"
on:  # rebuild any PRs and main branch changes
  pull_request:
  push:

jobs:
  pre-commit:  # make sure pre-commits work properly
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: 3.6
      - name: Cache npm env
        uses: actions/cache@v2
        env:
          cache-name: cache-npm-v1
        with:
          path: node_modules
          key: ${{ env.cache-name }}-${{ github.job }}-${{ hashFiles('package.json','package-lock.json') }}
      - name: "Install dependencies for npm"
        run: |
          npm ci
      - name: Cache pre-commit env
        uses: actions/cache@v2
        env:
          cache-name: cache-pre-commit-v1
        with:
          path: ~/.cache/pre-commit
          key: ${{ env.cache-name }}-${{ github.job }}-${{ hashFiles('.pre-commit-config.yaml') }}
      - name: "Install pre-commit"
        run: |
          pip install pre-commit
      - name: "Run pre-commit"
        run: |
          pre-commit run --all-files --show-diff-on-failure --color always
101 .github/actions/cancel-workflow-runs/.gitignore (vendored, new file)
@@ -0,0 +1,101 @@
# Dependency directory
node_modules

# Rest pulled from https://github.com/github/gitignore/blob/master/Node.gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*

# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage
*.lcov

# nyc test coverage
.nyc_output

# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# Bower dependency directory (https://bower.io/)
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release

# Dependency directories
jspm_packages/

# TypeScript v1 declaration files
typings/

# TypeScript cache
*.tsbuildinfo

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# dotenv environment variables file
.env
.env.test

# parcel-bundler cache (https://parceljs.org/)
.cache

# next.js build output
.next

# nuxt.js build output
.nuxt

# vuepress build output
.vuepress/dist

# Serverless directories
.serverless/

# FuseBox cache
.fusebox/

# DynamoDB Local files
.dynamodb/

# OS metadata
.DS_Store
Thumbs.db

# Ignore built ts files
__tests__/runner/*
lib/**/*

.idea
47 .github/actions/cancel-workflow-runs/.pre-commit-config.yaml (vendored, new file)
@@ -0,0 +1,47 @@
---
default_stages: [commit, push]
default_language_version:
  # force all unspecified python hooks to run python3
  python: python3
minimum_pre_commit_version: "1.20.0"
repos:
  - repo: https://github.com/Lucas-C/pre-commit-hooks
    rev: v1.1.7
    hooks:
      - id: forbid-tabs
        exclude: ^dist/index.js$
  - repo: https://github.com/thlorenz/doctoc.git
    rev: v1.4.0
    hooks:
      - id: doctoc
        name: Add TOC for md files
        files: ^README\.md$|^CONTRIBUTING\.md$|^UPDATING.md$|^dev/README\.md$|^dev/BACKPORT_PACKAGES.md$
  - repo: meta
    hooks:
      - id: check-hooks-apply
  - repo: https://github.com/adrienverge/yamllint
    rev: v1.23.0
    hooks:
      - id: yamllint
        name: Check yaml files with yamllint
        entry: yamllint -c yamllint-config.yml
        types: [yaml]
        exclude: ^.*init_git_sync\.template\.yaml$|^.*airflow\.template\.yaml$|^chart/templates/.*\.yaml$
  - repo: local
    hooks:
      - id: build
        name: Build package for distribution
        language: system
        entry: bash -c "npm run release"
        files: .*\.ts$
        require_serial: true
        pass_filenames: false
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v3.1.0
    hooks:
      - id: check-merge-conflict
      - id: detect-private-key
      - id: end-of-file-fixer
        exclude: ^dist/.*
      - id: trailing-whitespace
        exclude: ^dist/.*
3 .github/actions/cancel-workflow-runs/.prettierignore (vendored, new file)
@@ -0,0 +1,3 @@
dist/
lib/
node_modules/
11 .github/actions/cancel-workflow-runs/.prettierrc.json (vendored, new file)
@@ -0,0 +1,11 @@
{
  "printWidth": 80,
  "tabWidth": 2,
  "useTabs": false,
  "semi": false,
  "singleQuote": true,
  "trailingComma": "none",
  "bracketSpacing": false,
  "arrowParens": "avoid",
  "parser": "typescript"
}
22 .github/actions/cancel-workflow-runs/LICENSE (vendored, new file)
@@ -0,0 +1,22 @@
The MIT License (MIT)

Copyright (c) 2018 GitHub, Inc. and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
731 .github/actions/cancel-workflow-runs/README.md (vendored, new file)
@@ -0,0 +1,731 @@
<p><a href="https://github.com/potiuk/cancel-workflow-runs/actions">
<img alt="cancel-workflow-runs status"
src="https://github.com/potiuk/cancel-workflow-runs/workflows/Test%20the%20build/badge.svg"></a>

# Cancel Workflow Runs action

<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)*

- [Context and motivation](#context-and-motivation)
- [Usage](#usage)
- [Inputs and outputs](#inputs-and-outputs)
  - [Inputs](#inputs)
  - [Outputs](#outputs)
  - [Most often used canceling example](#most-often-used-canceling-example)
- [More Examples](#more-examples)
  - [Repositories that use Pull Requests from forks](#repositories-that-use-pull-requests-from-forks)
    - [Cancel duplicate runs for the source workflow](#cancel-duplicate-runs-for-the-source-workflow)
    - [Cancel duplicate jobs for triggered workflow](#cancel-duplicate-jobs-for-triggered-workflow)
    - [Cancel the "self" source workflow run](#cancel-the-self-source-workflow-run)
    - [Cancel the "self" triggered workflow run](#cancel-the-self-triggered-workflow-run)
    - [Fail-fast source workflow runs with failed jobs](#fail-fast-source-workflow-runs-with-failed-jobs)
    - [Fail-fast source workflow runs with failed jobs and corresponding triggered runs](#fail-fast-source-workflow-runs-with-failed-jobs-and-corresponding-triggered-runs)
    - [Fail-fast for triggered workflow runs with failed jobs](#fail-fast-for-triggered-workflow-runs-with-failed-jobs)
    - [Cancel another workflow run](#cancel-another-workflow-run)
    - [Cancel all duplicates for named jobs](#cancel-all-duplicates-for-named-jobs)
  - [Repositories that do not use Pull Requests from forks](#repositories-that-do-not-use-pull-requests-from-forks)
    - [Cancel duplicate runs for "self" workflow](#cancel-duplicate-runs-for-self-workflow)
    - [Cancel "self" workflow run](#cancel-self-workflow-run)
    - [Fail-fast workflow runs with failed jobs](#fail-fast-workflow-runs-with-failed-jobs)
    - [Cancel all runs with named jobs](#cancel-all-runs-with-named-jobs)
- [Development environment](#development-environment)
- [License](#license)

<!-- END doctoc generated TOC please keep comment here to allow auto update -->
# Context and motivation

Cancel Workflow Runs is an action that utilizes `workflow_run` triggers in order to perform various
run cancel operations. The idea is to save jobs and free them up for other queued runs. It is
particularly useful in projects whose development flow has contributors submitting pull requests
from forks. Using the `workflow_run` trigger enables safe canceling of runs triggered by such pull requests.

In case your CI takes a lot of time and uses a lot of jobs, the action might help your project
reduce job usage and decrease waiting time, as it detects and cancels runs that are still executing
but are already known to be superseded by newer runs.

The main purpose of this action is canceling duplicated runs for the same branch as the current run,
effectively limiting the resource consumption of the workflow to one run per branch. In short, the action
is useful if you want to limit job usage on GitHub Actions when fixups/rebases are pushed in quick
succession to the same branch (fast iterations on a Pull Request).
This is achieved by the `duplicates` cancel mode. The `duplicates` mode only cancels "past" runs - it does
not take into account runs that were started after the "current" run.

Another use case is to cancel the `pull_request` run corresponding to the `workflow_run` triggered run.
This can happen when the triggered `workflow_run` finds that it makes no sense to proceed with
the source run. This is achieved by the `self` cancel mode.

There are also two supplementary cancel modes for the action. Those supplementary use cases allow for further
optimisations - failing fast in case we detect that an important job failed, and canceling duplicates of the
`workflow_run` triggered events in case they execute some heavy jobs. This is achieved by the `failedJobs` and
`namedJobs` cancel modes.

Note that the `namedJobs` cancel mode exists solely to bypass current limitations
of GitHub Actions. Currently, there is no way to retrieve the connection between the triggering and triggered
workflow in case of `workflow_run`, nor to retrieve the repository and branch of the triggering
workflow. The action uses a workaround - it requires designing workflows in such a way that they pass the
necessary information via carefully crafted job names. The job names are accessible via the GitHub API, and
they can be resolved during execution of the workflow using information about the linked workflow available
at the workflow runtime. Hopefully this information will soon be available in GitHub Actions, allowing
removal of the `namedJobs` cancel mode and simplifying the examples and workflows using the action.

Another feature of the action is notifying the PRs linked to the workflows. Normally, when workflows
get cancelled there is no information about why it happened, but this action can add an explanatory comment
to the PR if the PR run gets cancelled. This is controlled by the `notifyPRCancel` boolean input.

Also, for `workflow_run` events, GitHub does not yet provide an easy interface linking the original
Pull Request and the workflow_run. You can ask the Cancel Workflow Runs action to add an extra comment to
the PR with an explanatory message followed by a link to the `workflow_run` run.

You can take a look at the description provided in the
[Apache Airflow's CI](https://github.com/apache/airflow/blob/master/CI.rst) and
[the workflows](https://github.com/apache/airflow/blob/master/.github/workflows).

The action started from the simple cancel workflow developed by [n1hility](https://github.com/n1hility)
that implemented cancelling previous runs before GitHub Actions introduced the `workflow_run` type of event:
[Cancel](https://github.com/n1hility/cancel-previous-runs).
# Usage

If you want a comprehensive solution, you should use the action as follows:

1) In case your project does not use public forks, it's enough to have one action with the `duplicates`
   cancel mode in the workflow. This is rare in open-source projects (those usually
   accept pull requests from forks) and more often applicable to private repositories.

2) If you use forks, you should create a separate "Cancelling" `workflow_run` triggered workflow.
   The `workflow_run` workflow should be responsible for all canceling actions. The examples below show
   the possible ways the action can be utilized.
# Inputs and outputs

## Inputs

| Input                    | Required | Default      | Comment |
|--------------------------|----------|--------------|---------|
| `token`                  | yes      |              | The GitHub token passed from `${{ secrets.GITHUB_TOKEN }}` |
| `cancelMode`             | no       | `duplicates` | The mode to run cancel on. The available options are `duplicates`, `allDuplicates`, `self`, `failedJobs`, `namedJobs`, `allDuplicatedNamedJobs` |
| `cancelFutureDuplicates` | no       | true         | In case of duplicate canceling, also cancel future duplicates, leaving only the "freshest" running run and not all the future runs. Set to true by default. |
| `sourceRunId`            | no       |              | Useful only in `workflow_run` triggered events. It should be set to the id of the workflow run triggering the run, `${{ github.event.workflow_run.id }}`, in case the cancel operation should cancel the source workflow. |
| `notifyPRCancel`         | no       |              | Boolean. If set to true, it notifies the cancelled PRs with a comment containing the reason why they are being cancelled. |
| `notifyPRCancelMessage`  | no       |              | Optional cancel message to use instead of the default one when `notifyPRCancel` is true. It is only used in the `self` cancelling mode. |
| `notifyPRMessageStart`   | no       |              | Only for `workflow_run` events triggered by PRs. If not empty, it notifies those PRs with the specified message at the start of the workflow, adding a link to the triggered `workflow_run`. |
| `jobNameRegexps`         | no       |              | An array of job name regexps. Only runs containing any job name matching any of the regexps in this array are considered for cancelling in the `failedJobs`, `namedJobs`, and `allDuplicatedNamedJobs` modes. |
| `skipEventTypes`         | no       |              | Array of event names that should be skipped when cancelling (JSON-encoded string). This can be used to skip direct pushes or scheduled events. |
| `selfPreservation`       | no       | true         | Do not cancel self. |
| `workflowFileName`       | no       |              | Name of the workflow file. It can be used if you want to cancel a different workflow than yours. |

The job cancel modes work as follows:

| Cancel Mode              | No `sourceRunId` specified | The `sourceRunId` set to `${{ github.event.workflow_run.id }}` |
|--------------------------|----------------------------|----------------------------------------------------------------|
| `duplicates`             | Cancels duplicate runs from the same repo/branch as the current run. | Cancels duplicate runs for the same repo/branch as the source run. |
| `allDuplicates`          | Cancels duplicate runs from all running workflows. | Cancels duplicate runs from all running workflows. |
| `self`                   | Cancels the "self" run. | Cancels the `sourceRunId` run. |
| `failedJobs`             | Cancels all runs of its own workflow that have matching jobs that failed. | Cancels all runs of the `sourceRunId` workflow that have matching jobs that failed. |
| `namedJobs`              | Cancels all runs of its own workflow that have matching jobs. | Cancels all runs of the `sourceRunId` workflow that have matching jobs. |
| `allDuplicatedNamedJobs` | Cancels all duplicate runs of its own workflow that share a matching job pattern. | Cancels all runs of the `sourceRunId` workflow that share a matching job pattern. |
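Array-valued inputs such as `jobNameRegexps` and `skipEventTypes` are passed as JSON-encoded strings. A minimal sketch of how such a value would be decoded (plain `JSON.parse`; the variable names are illustrative, not the action's actual code):

```typescript
// skipEventTypes is supplied in the workflow file as a JSON-encoded string,
// e.g. skipEventTypes: '["push", "schedule"]'. Decoding it yields the array
// of event names that should not trigger cancellation.
const rawSkipEventTypes = '["push", "schedule"]';
const skipEventTypes: string[] = JSON.parse(rawSkipEventTypes);

console.log(skipEventTypes.includes('push')); // true
console.log(skipEventTypes.includes('pull_request')); // false
```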
## Outputs

| Output              | No `sourceRunId` specified | The `sourceRunId` set to `${{ github.event.workflow_run.id }}` |
|---------------------|----------------------------|----------------------------------------------------------------|
| `sourceHeadRepo`    | Current repository. Format: `owner/repo` | Repository of the run that triggered this `workflow_run`. Format: `owner/repo` |
| `sourceHeadBranch`  | Current branch. | Branch of the run that triggered this `workflow_run`. Might be a forked repo, if it is a pull_request. |
| `sourceHeadSha`     | Current commit SHA: `{{ github.sha }}` | Commit SHA of the run that triggered this `workflow_run`. |
| `mergeCommitSha`    | Merge commit SHA if PR-triggered event. | Merge commit SHA if PR-triggered event. |
| `targetCommitSha`   | Target commit SHA (merge if present, otherwise source). | Target commit SHA (merge if present, otherwise source). |
| `pullRequestNumber` | Number of the associated Pull Request (if PR-triggered) | Number of the associated Pull Request (if PR-triggered) |
| `sourceEvent`       | Current event: `${{ github.event }}` | Event of the run that triggered this `workflow_run` |
| `cancelledRuns`     | JSON-stringified array of cancelled run ids. | JSON-stringified array of cancelled run ids. |
## Most often used canceling example

The most common canceling example is when you want to cancel all duplicates appearing in your build queue.
As of version 4.1 of the action, this can be realised by a single workflow that cancels all duplicates
for all running workflows. It is resistant to temporary queues - it can also cancel future, queued
workflows that have fresher duplicates (also queued) - and this is recommended for everyone.

The example below uses the "workflow_run" type of event. The workflow_run event always has "write" access,
which allows it to cancel other workflow runs - even those coming from pull requests.

```yaml
name: Cancelling Duplicates
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-duplicate-workflow-runs:
    name: "Cancel duplicate workflow runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate workflow runs"
        with:
          cancelMode: allDuplicates
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
```
# More Examples

Note that you can combine the steps below in several steps of the same job. The examples here show
one step per case for clarity.

## Repositories that use Pull Requests from forks

Note that in case you implement a separate "Cancelling" workflow, following the examples below, you do not
need to add a cancel action to any other workflow. All cancel actions should be configured in this
Cancelling workflow.

These examples show how you should configure your project with a separate `Cancelling` workflow which is
triggered via the `workflow_run` trigger.

In the examples below we use the following names:

* **triggered workflow** - the "Cancelling" workflow - a separate workflow triggered by the `workflow_run`
  event. Its main job is to manage cancelling of other workflows.

* **triggered run** - the run of the *triggered workflow*. It is triggered by another ("source") run. In the
  examples below, this run is in the "Cancelling" workflow. It always runs in the context of the main
  repository, even if it is triggered by a Pull Request from a fork.

* **source workflow** - the "main" workflow - the main workflow that performs CI actions. In the examples
  below, this is the "CI" workflow.

* **source run** - the run of the *source workflow*. It is the run that triggers the *triggered run*,
  and it runs most of the CI tasks. In the examples below those are the runs of the "CI" workflow.
### Cancel duplicate runs for the source workflow

Cancel past, duplicate *source runs* of the *source workflow*. This workflow cancels
duplicated, past runs (for the same branch/repo as those associated with the *source run* that triggered
the *triggered run*). You have to create it with the `sourceRunId` input set to the value of
`${{ github.event.workflow_run.id }}` in order for it to work correctly.

In the example below, the `Cancelling` run cancels past, duplicate runs from `CI` with the same
branch/repo as the *source run* which triggered it - effectively, all that is left after the action is
the latest *source run* of "CI" from the same branch/repo.

This works for all kinds of triggering events (`push`, `pull_request`, `schedule`, ...). It works for
events triggered in the local repository, as well as those triggered from forks, so you do not need
to set up any extra actions to cancel internal Pushes/Pull Requests.

You can also choose to skip certain types of events (for example `push` and `schedule`) if you want your
jobs to run to full completion for those kinds of events.

```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-duplicate-workflow-runs:
    name: "Cancel duplicate workflow runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate workflow runs"
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
          notifyPRCancel: true
          skipEventTypes: '["push", "schedule"]'
```

Note that the `duplicates` cancel mode cannot be used for a `workflow_run` type of event without the
`sourceRunId` input. The action will throw an error in this case, because it would not really do what
you would expect it to do. All `workflow_run` events have the same branch and repository (they all run
in the context of the target branch and repository) no matter what the source of the event is, therefore
cancelling duplicates would cancel all the runs originating from all the branches, which is not really
expected.

If you want to cancel duplicate runs of the *triggered workflow*, you need to utilize the
`namedJobs` cancel mode as described in the next chapter,
[Cancel duplicate jobs for triggered workflow](#cancel-duplicate-jobs-for-triggered-workflow), using outputs
from the duplicate canceling for the *source workflow* run above.

Hopefully we will have an easier way of doing that in the future, once the GitHub Actions API allows
searching for source runs (it's not available at this moment).
### Cancel duplicate jobs for triggered workflow
|
||||
|
||||
Cancels all past runs from the *triggered workflow* if any of the job names match any of the regular
|
||||
expressions. Note that it does not take into account the branch of the runs. It will cancel all runs
|
||||
with matching job names no mater the branch/repo.
|
||||
|
||||
This example is much more complex. It shows the actual case on how you can design your jobs using with
|
||||
using outputs from the cancel duplicate action and running subsequent cancel with namedJobs cancel
|
||||
mode. Hopefully in the future better solution will come from Github Actions and such cancel flow will
|
||||
be natively supported by GitHub Actions but as of now (August 2020) such native support is not
|
||||
possible. The example below uses specially named jobs that contain Branch, Repo and Run id of
|
||||
the triggering run. The cancel operation finds the runs that have jobs with the names following
|
||||
pattern containing the same repo and branch as the source run branch and repo in order to cancel duplicates.
|
||||
|
||||
In the case below, this workflow will first cancel the "CI" duplicate runs from the same branch and then
|
||||
it will cancel the runs from the Cancelling workflow which contain the same repo and branch as
|
||||
in job names, effectively implementing cancelling duplicate runs for the Cancelling workflow.
|
||||
|
||||
|
||||
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-duplicate-ci-runs:
    name: "Cancel duplicate CI runs"
    runs-on: ubuntu-latest
    outputs:
      sourceHeadRepo: ${{ steps.cancel.outputs.sourceHeadRepo }}
      sourceHeadBranch: ${{ steps.cancel.outputs.sourceHeadBranch }}
      sourceHeadSha: ${{ steps.cancel.outputs.sourceHeadSha }}
      sourceEvent: ${{ steps.cancel.outputs.sourceEvent }}
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        id: cancel
        name: "Cancel duplicate CI runs"
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
          notifyPRMessageStart: |
            Note! The Docker Images for the build are prepared in a separate workflow,
            that you will not see in the list of checks.

            You can check the status of those images in:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate Cancelling runs"
        with:
          cancelMode: namedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
          jobNameRegexps: >
            ["Build info
            repo: ${{ steps.cancel.outputs.sourceHeadRepo }}
            branch: ${{ steps.cancel.outputs.sourceHeadBranch }}.*"]

  build-info:
    name: >
      Build info
      repo: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadRepo }}
      branch: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadBranch }}
    runs-on: ubuntu-latest
    needs: [cancel-duplicate-ci-runs]
    env:
      GITHUB_CONTEXT: ${{ toJson(github) }}
    steps:
      - name: >
          [${{ needs.cancel-duplicate-ci-runs.outputs.sourceEvent }}] will checkout
          Run id: ${{ github.run_id }}
          Source Run id: ${{ github.event.workflow_run.id }}
          Sha: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadSha }}
          Repo: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadRepo }}
          Branch: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadBranch }}
        run: |
          printenv
```

### Cancel the "self" source workflow run

This is useful in case you decide to cancel the *source run* that triggered the *triggered run*.
In the case below, the step cancels the `CI` workflow run that triggered the `Cancelling` run.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-self-source-workflow-run:
    name: "Cancel the self CI workflow run"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel the self CI workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          notifyPRCancelMessage: Cancelled because image building failed.
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
```

### Cancel the "self" triggered workflow run

This is useful in case you decide to cancel the *triggered run* itself. The difference from the previous
case is that you do not specify the `sourceRunId` input.

In the case below, the workflow cancels its own run.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-self-cancelling-run:
    name: "Cancel the self Cancelling workflow run"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel the self Cancelling workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          token: ${{ secrets.GITHUB_TOKEN }}
```

Note that if you want to cancel both the source workflow run and the self run, you need to cancel
the source run first, and then the self one, not the other way round :).
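For example, the two cancel steps can be combined in that order within one job (a sketch assembled from the examples above, not a tested workflow):

```yaml
    steps:
      # First cancel the source "CI" run that triggered this workflow ...
      - name: "Cancel the source CI workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
      # ... and only then cancel this Cancelling run itself.
      - name: "Cancel the self Cancelling workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          token: ${{ secrets.GITHUB_TOKEN }}
```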

### Fail-fast source workflow runs with failed jobs

Cancels all runs from the *source workflow* if there are failed jobs matching any of the regular expressions.
Note that the action does not take into account the branch/repo of the runs. It will cancel all runs
with failed jobs, no matter the branch/repo.

In the case below, if any of the `CI` workflow runs (even with different branch heads) have failed jobs
with names matching the `^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, they
will be cancelled.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  fail-fast-triggered-workflow-named-jobs-runs:
    name: "Fail fast CI runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Fail fast CI runs"
        with:
          cancelMode: failedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
          notifyPRCancel: true
          jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
```

Note that if you not only want to cancel the failed triggering workflows but also
want to fail the corresponding "Cancelling" workflows, you need to implement the solution
described in the next chapter.

### Fail-fast source workflow runs with failed jobs and corresponding triggered runs

Cancels all runs from the *source workflow* if there are failed jobs matching any of the regular expressions,
and also cancels the corresponding *triggered runs*.
Note that the action does not take into account the branch/repo of the runs. It will cancel all runs
with failed jobs, no matter the branch/repo.

In the case below, if any of the `CI` workflow runs (even with different branch heads) have failed jobs
with names matching the `^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, they
will be cancelled, as well as the corresponding "Cancelling" workflow runs.

There is no native support yet in GitHub Actions to do this easily, so the example below shows how it can be
achieved using `namedJobs` and the output returned from the previous `Cancel Workflow Runs` step. Hopefully
this will be simplified once GitHub Actions introduces native support for it.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  fail-fast-triggered-workflow-named-jobs-runs:
    name: "Fail fast CI runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Fail fast CI. Source run: ${{ github.event.workflow_run.id }}"
        id: cancel-failed
        with:
          cancelMode: failedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
          notifyPRCancel: true
          jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
      - name: "Extract cancelled failed runs"
        id: extract-cancelled-failed-runs
        if: steps.cancel-failed.outputs.cancelledRuns != '[]'
        run: |
          REGEXP="Fail fast CI. Source run: "
          SEPARATOR=""
          for run_id in $(echo "${{ steps.cancel-failed.outputs.cancelledRuns }}" | jq '.[]')
          do
            REGEXP="${REGEXP}${SEPARATOR}(${run_id})"
            SEPARATOR="|"
          done
          echo "::set-output name=matching-regexp::${REGEXP}"
      - name: "Cancel triggered 'Cancelling' runs for the cancelled failed runs"
        if: steps.cancel-failed.outputs.cancelledRuns != '[]'
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: namedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
          jobNameRegexps: >
            ["${{ steps.extract-cancelled-failed-runs.outputs.matching-regexp }}"]
```
### Fail-fast for triggered workflow runs with failed jobs

Cancels all runs from the *triggered workflow* if there are failed jobs matching any of the regular
expressions. Note that it does not take into account the branch/repo of the runs. It will cancel all runs
with failed jobs, no matter the branch/repo.

In the case below, if any of the `Cancelling` workflow runs (even with different branch heads) have failed jobs
with names matching the `^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, they
will be cancelled.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  fail-fast-triggered-workflow-named-jobs-runs:
    name: "Fail fast Cancelling runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Fail fast Cancelling runs"
        with:
          cancelMode: failedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
```

### Cancel another workflow run

This is useful in case you want to cancel runs of a workflow other than the one the step runs in.
In the case below, the step cancels duplicate runs of the workflow defined in `other_workflow.yml`.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-other-workflow-run:
    name: "Cancel another workflow's duplicate runs"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel another workflow's duplicate runs"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          workflowFileName: other_workflow.yml
```

### Cancel all duplicates for named jobs

Cancels all duplicated runs for all jobs that match the specified regular expressions.
Note that it does not take into account the branch of the runs. It will cancel all duplicates with
the same match for jobs, no matter which branch originated them.

This is useful in case of job names generated dynamically.

In the case below, for all the runs whose generated job names contain the same Branch/Repo/Event
combination, the duplicates will get cancelled, leaving only the most recent run for each exact
match.

Note that the match must be identical. If two jobs have a different Branch,
they both match the same pattern, but they are not considered duplicates of each other.

Also, this job has self-preservation turned off.
This means that if the job determines that it is itself a duplicate, it will cancel itself. That's
why checking for duplicates of the self-workflow should be the last step in the cancelling process.

```yaml
on:
  push:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-self-failed-runs:
    name: "Cancel the self workflow run"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel past CI runs"
        with:
          cancelMode: allDuplicatedNamedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          jobNameRegexps: '["Branch: .* Repo: .* Event: .* "]'
          selfPreservation: false
          notifyPRCancel: true
```
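The duplicate-grouping semantics described above can be sketched in a few lines of Python (a simplified model of the behaviour, not the action's actual code; the helper name and data shape are made up for illustration):

```python
import re

def find_duplicates(runs, pattern):
    # Group runs by the exact text the pattern matches inside a job name;
    # runs with an identical match are duplicates of each other.
    groups = {}
    for run_id, created_at, job_name in runs:
        match = re.search(pattern, job_name)
        if match:
            groups.setdefault(match.group(0), []).append((created_at, run_id))
    to_cancel = []
    for entries in groups.values():
        entries.sort()  # oldest first; keep only the most recent run per match
        to_cancel.extend(run_id for _, run_id in entries[:-1])
    return to_cancel

# Hypothetical runs: (run id, creation time, generated job name).
runs = [
    (1, "2020-08-01T10:00", "Branch: main Repo: a/b Event: push "),
    (2, "2020-08-01T11:00", "Branch: main Repo: a/b Event: push "),
    (3, "2020-08-01T11:30", "Branch: dev Repo: a/b Event: push "),
]
print(find_duplicates(runs, r"Branch: .* Repo: .* Event: .* "))  # -> [1]
```

Runs 1 and 2 produce the identical matched text, so the older one is cancelled; run 3 matches the same pattern but with a different Branch, so it is not a duplicate of the others.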


## Repositories that do not use Pull Requests from forks

Note that the examples in this chapter only work if you do not have Pull Requests coming from forks (for
example, if you only work in a private repository). When these actions run within the usual `pull_request`
triggered runs coming from a fork, they do not have enough permissions to cancel running workflows.

If you want to cancel `pull_request` runs from forks, you need to use `workflow_run` triggered runs - see the
[Repositories that use Pull Requests from forks](#repositories-that-use-pull-requests-from-forks) chapter.

Note that in case you configure the separate `workflow_run` Cancelling workflow, there is no need to add
the action to the "source" workflows. The "Cancelling workflow" pattern handles not only Pull Requests
from forks, but also all other cases - including cancelling Pull Requests for the same repository
and cancelling scheduled runs.

### Cancel duplicate runs for "self" workflow

Cancels past runs for the same workflow (with the same branch).

In the case below, any of the direct "push" events will cancel all past runs for the same branch as the
one being pushed. However, it can be configured for "pull_request" (in the same repository) or "schedule"
types of events as well. It will also notify the PR with a comment containing why it has been
cancelled.
```yaml
name: CI
on: push
jobs:
  cancel-duplicate-workflow-runs:
    name: "Cancel duplicate workflow runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate workflow runs"
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
```

### Cancel "self" workflow run

This is useful in case you decide to cancel the "self" run.

In the case below, the workflow's own run will be cancelled immediately. It can be configured for "push",
"pull_request" (from the same repository) or "schedule" types of events.
```yaml
name: CI
on: push
jobs:
  cancel-self-run:
    name: "Cancel the self workflow run"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel the self workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
```

### Fail-fast workflow runs with failed jobs

Cancels all runs (including the self run!) if they have failed jobs matching any of the regular expressions.
Note that it does not take into account the branch of the running jobs. It will cancel all runs with failed
jobs, no matter which branch originated them.

In the case below, if any of the workflow's own runs have failed jobs matching any of the
`^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, this workflow will cancel those runs.
```yaml
name: CI
on:
  push:

jobs:
  cancel-self-failed-runs:
    name: "Cancel failed runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel failed runs"
        with:
          cancelMode: failedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
          notifyPRCancel: true
```

### Cancel all runs with named jobs

Cancels all runs (including the self run!) if any of the job names match any of the regular
expressions. Note that it does not take into account the branch of the runs. It will cancel all runs with
matching jobs, no matter which branch originated them.

This is useful in case of job names generated dynamically.

In the case below, if any of the "self" workflow runs have job names matching any of the
`^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, this workflow will cancel those runs.
```yaml
on:
  push:
  workflow_run:
    workflows: ['CI']
    types: ['requested']

jobs:
  cancel-self-failed-runs:
    name: "Cancel the self workflow run"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel past CI runs"
        with:
          cancelMode: namedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
          notifyPRCancel: true
```
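Since `jobNameRegexps` takes a JSON-encoded string, one safe way to build such a value (a side note, not part of the action) is to serialize a list of patterns:

```python
import json

# Hypothetical list of job-name patterns to match.
patterns = ["^Static checks$", "^Build docs$", "^Build prod image.*"]

# json.dumps produces exactly the JSON-encoded array string the input expects.
job_name_regexps = json.dumps(patterns)
print(job_name_regexps)  # -> ["^Static checks$", "^Build docs$", "^Build prod image.*"]
```

This avoids hand-escaping quotes inside the YAML string.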


## Development environment

It is highly recommended to use [pre-commit](https://pre-commit.com). The hooks
installed via the pre-commit tool automatically handle linting (including automated fixes) as well
as building and packaging the JavaScript index.js from the main.ts TypeScript code, so you do not have
to run those steps yourself.

## License

[MIT License](LICENSE) covers the scripts and documentation in this project.
5 .github/actions/cancel-workflow-runs/__tests__/main.test.ts vendored Normal file
@@ -0,0 +1,5 @@
import * as process from 'process'
import * as cp from 'child_process'
import * as path from 'path'

test('no op', () => {})
74 .github/actions/cancel-workflow-runs/action.yml vendored Normal file
@@ -0,0 +1,74 @@
name: 'Cancel Workflow Runs'
description: 'Cancel Workflow Runs - duplicates, failed, named - in order to limit job usage.'
author: 'potiuk'
inputs:
  token:
    description: The GITHUB_TOKEN secret of the repository
    required: true
  sourceRunId:
    description: |
      The run that triggered the action. It should be set to the
      `${{ github.event.workflow_run.id }}` variable if used in a `workflow_run` triggered run and
      you want to act on the source workflow rather than the triggered run.
    required: false
  notifyPRCancel:
    description: |
      Boolean. If set to true, it notifies the cancelled PRs with a comment containing the reason why
      they are being cancelled.
    required: false
  notifyPRCancelMessage:
    description: |
      Optional cancel message to use instead of the default one when notifyPRCancel is true. Only
      used in 'self' cancel mode.
    required: false
  notifyPRMessageStart:
    description: |
      Only for workflow_run events triggered by the PRs. If not empty, it notifies those PRs with the
      message specified at the start of the workflow - adding the link to the triggered workflow_run.
    required: false
  cancelMode:
    description: |
      The mode of cancel. One of:
        * `duplicates` - cancels duplicate runs from the same repo/branch as the local run or
                         sourceId workflow. This is the default mode when cancelMode is not specified.
        * `allDuplicates` - cancels duplicate runs from all workflows. It is a more aggressive version of
                         duplicate cancelling - as it cancels all duplicates. It is helpful in case
                         of long queues of builds - as it is enough that one of the workflows that
                         cancel duplicates is executed, it can effectively clean up the queue in this
                         case for all the future, queued runs.
        * `self` - cancels the self run - either the own run if sourceRunId is not set, or
                         the source run that triggered the `workflow_run`
        * `failedJobs` - cancels all runs that failed in jobs matching one of the regexps
        * `namedJobs` - cancels runs where names of some jobs match some of the regexps
    required: false
  cancelFutureDuplicates:
    description: |
      In case of duplicate cancelling, cancel also future duplicates, leaving only the "freshest" running
      job and not all the future jobs. By default it is set to true.
    required: false
  selfPreservation:
    description: |
      Do not cancel your own run. There are cases where selfPreservation should be disabled, but it is
      enabled by default. You can disable it by setting 'false' as value.
    required: false
  jobNameRegexps:
    description: |
      Array of job name regexps (JSON-encoded string). Used by `failedJobs` and `namedJobs` cancel modes
      to match job names of workflow runs.
    required: false
  skipEventTypes:
    description: |
      Array of event names that should be skipped when cancelling (JSON-encoded string). This might be used
      in order to skip direct pushes or scheduled events.
    required: false
  workflowFileName:
    description: |
      Name of the workflow file. It can be used if you want to cancel a different workflow than yours.
    required: false

runs:
  using: 'node12'
  main: 'dist/index.js'
branding:
  icon: 'play'
  color: 'blue'
11 .github/actions/cancel-workflow-runs/jest.config.js vendored Normal file
@@ -0,0 +1,11 @@
module.exports = {
  clearMocks: true,
  moduleFileExtensions: ['js', 'ts'],
  testEnvironment: 'node',
  testMatch: ['**/*.test.ts'],
  testRunner: 'jest-circus/runner',
  transform: {
    '^.+\\.ts$': 'ts-jest'
  },
  verbose: true
}
10921 .github/actions/cancel-workflow-runs/package-lock.json generated vendored Normal file
File diff suppressed because it is too large
48 .github/actions/cancel-workflow-runs/package.json vendored Normal file
@@ -0,0 +1,48 @@
{
  "name": "typescript-action",
  "version": "0.0.0",
  "private": true,
  "description": "TypeScript template action",
  "main": "lib/main.js",
  "scripts": {
    "build": "tsc",
    "format": "prettier --write **/*.ts",
    "format-check": "prettier --check **/*.ts",
    "lint": "eslint src/**/*.ts",
    "pack": "ncc build",
    "test": "jest",
    "all": "npm run build && npm run format && npm run lint && npm run pack && npm test",
    "release": "ncc build -o dist src/main.ts"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/actions/typescript-action.git"
  },
  "keywords": [
    "actions",
    "node",
    "setup"
  ],
  "author": "YourNameOrOrganization",
  "license": "MIT",
  "dependencies": {
    "@actions/core": "^1.2.2",
    "@actions/github": "^2.1.0",
    "jstreemap": "^1.28.2"
  },
  "devDependencies": {
    "@types/jest": "^24.0.23",
    "@types/node": "^12.7.12",
    "@typescript-eslint/parser": "^2.8.0",
    "@zeit/ncc": "^0.20.5",
    "eslint": "^5.16.0",
    "eslint-plugin-github": "^2.0.0",
    "eslint-plugin-jest": "^22.21.0",
    "jest": "^26.2.2",
    "jest-circus": "^26.2.2",
    "js-yaml": "^3.13.1",
    "prettier": "^1.19.1",
    "ts-jest": "^26.1.4",
    "typescript": "^3.6.4"
  }
}
1519 .github/actions/cancel-workflow-runs/src/main.ts vendored Normal file
File diff suppressed because it is too large
12 .github/actions/cancel-workflow-runs/tsconfig.json vendored Normal file
@@ -0,0 +1,12 @@
{
  "compilerOptions": {
    "target": "es6", /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
    "module": "commonjs", /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */
    "outDir": "./lib", /* Redirect output structure to the directory. */
    "rootDir": "./src", /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */
    "strict": true, /* Enable all strict type-checking options. */
    "noImplicitAny": true, /* Raise error on expressions and declarations with an implied 'any' type. */
    "esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */
  },
  "exclude": ["node_modules", "**/*.test.ts"]
}
22 .github/actions/cancel-workflow-runs/yamllint-config.yml vendored Normal file
@@ -0,0 +1,22 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
---
extends: default

rules:
  line-length:
    max: 110
34 .github/actions/change-detector/action.yml vendored
@@ -1,34 +0,0 @@
name: Change Detector
description: Detects file changes for pull request and push events
inputs:
  token:
    description: GitHub token for authentication
    required: true
outputs:
  python:
    description: Whether Python-related files were changed
    value: ${{ steps.change-detector.outputs.python }}
  frontend:
    description: Whether frontend-related files were changed
    value: ${{ steps.change-detector.outputs.frontend }}
  docker:
    description: Whether docker-related files were changed
    value: ${{ steps.change-detector.outputs.docker }}
  docs:
    description: Whether docs-related files were changed
    value: ${{ steps.change-detector.outputs.docs }}
  superset-extensions-cli:
    description: Whether superset-extensions-cli package-related files were changed
    value: ${{ steps.change-detector.outputs.superset-extensions-cli }}
runs:
  using: composite
  steps:
    - name: Detect file changes
      id: change-detector
      run: |
        python --version
        python scripts/change_detector.py
      shell: bash
      env:
        GITHUB_TOKEN: ${{ inputs.token }}
        GITHUB_OUTPUT: ${{ github.output }}
@@ -1,23 +0,0 @@
name: Label Draft PRs
on:
  pull_request:
    types:
      - opened
      - converted_to_draft
jobs:
  label-draft:
    runs-on: ubuntu-latest
    steps:
      - name: Check if the PR is a draft
        id: check-draft
        uses: actions/github-script@v6
        with:
          script: |
            const isDraft = context.payload.pull_request.draft;
            core.setOutput('isDraft', isDraft);
      - name: Add `review:draft` Label
        if: steps.check-draft.outputs.isDraft == 'true'
        uses: actions-ecosystem/action-add-labels@v1
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          labels: "review:draft"
1 .github/actions/chart-releaser-action vendored
Submodule .github/actions/chart-releaser-action deleted from a917fd15b2
1 .github/actions/chart-testing-action vendored
Submodule .github/actions/chart-testing-action deleted from afea100a51
1 .github/actions/comment-on-pr vendored
Submodule .github/actions/comment-on-pr deleted from 85a56be792
13 .github/actions/comment-on-pr/Dockerfile vendored Normal file
@@ -0,0 +1,13 @@
FROM ruby:2.6.0

LABEL "com.github.actions.name"="Comment on PR"
LABEL "com.github.actions.description"="Leaves a comment on an open PR matching a push event."
LABEL "com.github.actions.repository"="https://github.com/unsplash/comment-on-pr"
LABEL "com.github.actions.maintainer"="Aaron Klaassen <aaron@unsplash.com>"
LABEL "com.github.actions.icon"="message-square"
LABEL "com.github.actions.color"="blue"

RUN gem install octokit

ADD entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
7 .github/actions/comment-on-pr/LICENSE vendored Normal file
@@ -0,0 +1,7 @@
Copyright 2019 Unsplash Inc.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
27 .github/actions/comment-on-pr/README.md vendored Normal file
@@ -0,0 +1,27 @@
# Comment on PR via GitHub Action

A GitHub action to comment on the relevant open PR when a commit is pushed.

## Usage

- Requires the `GITHUB_TOKEN` secret.
- Requires the comment's message in the `msg` parameter.
- Supports `push` and `pull_request` event types.

### Sample workflow

```
name: comment-on-pr example
on: pull_request
jobs:
  example:
    name: sample comment
    runs-on: ubuntu-latest
    steps:
      - name: comment PR
        uses: unsplash/comment-on-pr@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          msg: "Check out this message!"
```
15 .github/actions/comment-on-pr/action.yml vendored Normal file
@@ -0,0 +1,15 @@
name: Comment on PR
author: Aaron Klaassen <aaron@unsplash.com>
description: Leaves a comment on an open PR matching a push event.
branding:
  icon: 'message-square'
  color: 'blue'
inputs:
  msg:
    description: Comment's message
    required: true
runs:
  using: 'docker'
  image: 'Dockerfile'
  args:
    - ${{ inputs.msg }}
47 .github/actions/comment-on-pr/entrypoint.sh vendored Executable file
@@ -0,0 +1,47 @@
#!/usr/bin/env ruby

require "json"
require "octokit"

json = File.read(ENV.fetch("GITHUB_EVENT_PATH"))
event = JSON.parse(json)

github = Octokit::Client.new(access_token: ENV["GITHUB_TOKEN"])

if !ENV["GITHUB_TOKEN"]
  puts "Missing GITHUB_TOKEN"
  exit(1)
end

if ARGV.empty?
  puts "Missing message argument."
  exit(1)
end

repo = event["repository"]["full_name"]

if ENV.fetch("GITHUB_EVENT_NAME") == "pull_request"
  pr_number = event["number"]
else
  pulls = github.pull_requests(repo, state: "open")

  push_head = event["after"]
  pr = pulls.find { |pr| pr["head"]["sha"] == push_head }

  if !pr
    puts "Couldn't find an open pull request for branch with head at #{push_head}."
    exit(1)
  end
  pr_number = pr["number"]
end
message = ARGV.join(' ')

coms = github.issue_comments(repo, pr_number)
duplicate = coms.find { |c| c["user"]["login"] == "github-actions[bot]" && c["body"] == message }

if duplicate
  puts "The PR already contains a database change notification"
  exit(0)
end

github.add_comment(repo, pr_number, message)
1 .github/actions/file-changes-action vendored
Submodule .github/actions/file-changes-action deleted from a6ca26c142
55 .github/actions/file-changes-action/.codecov.yml vendored Normal file
@@ -0,0 +1,55 @@
codecov:
  notify:
    require_ci_to_pass: yes

coverage:
  notify:
    slack:
      default:
        threshold: 1%
        message: "Coverage {{changed}} for {{owner}}/{{repo}}" # customize the message
        attachments: "sunburst, diff"
        only_pulls: false
  status:
    src:
      target: auto
      threshold: 7%
      base: auto
      if_ci_failed: success
      paths:
        - src/
        - '!src/tests/'
      flags:
        - src
    test:
      target: 60%
      threshold: 10%
      if_ci_failed: error
      base: auto
      paths:
        - src/tests/
      flags:
        - test
  precision: 2
  round: down
  range: "70...100"

flags:
  src:
    paths:
      - src
      - '!src/tests/'
  test:
    paths:
      - src/tests/

parsers:
  gcov:
    branch_detection:
      conditional: yes
      loop: yes
      method: no
      macro: no

comment:
  layout: "reach,diff,flags,tree"
  behavior: default
  require_changes: no
73 .github/actions/file-changes-action/.eslintrc.yml vendored Normal file
@@ -0,0 +1,73 @@
plugins:
  - '@typescript-eslint'
  - eslint-comments
  - promise
  - unicorn
extends:
  - airbnb-typescript
  - plugin:@typescript-eslint/recommended
  - plugin:eslint-comments/recommended
  - plugin:promise/recommended
  - plugin:unicorn/recommended
  - prettier
  - prettier/@typescript-eslint
settings:
  import/parsers:
    '@typescript-eslint/parser':
      - .ts
      - .tsx
      - .js
  import/resolver:
    typescript: {}
rules:
  unicorn/filename-case: off
  react/static-property-placement: 0
  no-prototype-builtins: 0
  import/prefer-default-export: 0
  '@typescript-eslint/no-explicit-any': 0
  import/no-default-export: error
  no-use-before-define:
    - error
    - functions: false
      classes: true
      variables: true
  '@typescript-eslint/explicit-function-return-type':
    - error
    - allowExpressions: true
      allowTypedFunctionExpressions: true
  '@typescript-eslint/no-use-before-define':
    - error
    - functions: false
      classes: true
      variables: true
      typedefs: true
  '@typescript-eslint/indent':
    - 2
    - 2
  unicorn/prevent-abbreviations: 0
  import/no-extraneous-dependencies: [error, {devDependencies: ['**/*.ts']}]
parser: "@typescript-eslint/parser"
parserOptions:
  project: ./tsconfig.json
  ecmaVersion: 2019
  sourceType: module
env:
  node: true
  browser: true
ignorePatterns:
  - '*.js'
overrides:
  - files: ['src/tests/**/*']
    plugins:
      - jest
    extends:
      - plugin:jest/recommended
    rules:
      global-require: 0
      '@typescript-eslint/no-var-requires': 0
      no-console: 0
      '@typescript-eslint/no-unused-vars': 0
      '@typescript-eslint/no-throw-literal': 0
3 .github/actions/file-changes-action/.github/CONTRIBUTING.md vendored Normal file
@@ -0,0 +1,3 @@
# Contributing

The repository is released under the MIT license and follows a standard GitHub development process, using the GitHub issue tracker and merging pull requests into master.
17 .github/actions/file-changes-action/.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
@@ -0,0 +1,17 @@
---
name: Bug report
about: Create a report to help us improve

---

**Describe the bug**
A clear and concise description of what the bug is.

**Workflow**
If applicable, provide a workflow file to help explain your problem.

**Expected behavior**
A clear and concise description of what you expected to happen.

**Additional context**
Add any other context about the problem here.
17 .github/actions/file-changes-action/.github/ISSUE_TEMPLATE/feature_request.md vendored Normal file
@@ -0,0 +1,17 @@
---
name: Feature request
about: Suggest an idea for this project

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
14 .github/actions/file-changes-action/.github/PULL_REQUEST_TEMPLATE.md vendored Normal file
@@ -0,0 +1,14 @@
### Type of Change
<!-- What type of change does your code introduce? -->
- [ ] New feature
- [ ] Bug fix
- [ ] Documentation
- [ ] Refactor
- [ ] Chore

### Resolves
- Fixes #[Add issue number here.]

### Describe Changes
<!-- Describe your changes in detail, if applicable. -->
_Describe what this Pull Request does_
1 .github/actions/file-changes-action/.github/actions/integration/events/files.csv vendored Normal file
@@ -0,0 +1 @@
.codecov.yml,.eslintignore,.eslintrc.json,.eslintrc.yml,.github/workflows/integration.yml,.github/workflows/pr.yml,.github/workflows/push.yml,.github/workflows/readme.md,.gitignore,.prettierignore,.prettierrc.json,.prettierrc.yml,.releaserc.yml,Makefile,README.md,__tests__/main.test.ts,action.yml,dist/index.js,jest.config.js,package.json,src/ChangedFiles.ts,src/File.ts,src/FilesHelper.ts,src/GithubHelper.ts,src/InputHelper.ts,src/UtilsHelper.ts,src/main.ts,src/tests/FilesHelper.test.ts,src/tests/GithubHelper.test.ts,src/tests/InputHelper.test.ts,src/tests/UtilsHelper.test.ts,src/tests/main.test.ts,src/tests/mocks/core/index.test.ts,src/tests/mocks/core/index.ts,src/tests/mocks/env/events/issue_comment_created.json,src/tests/mocks/env/events/issue_comment_edited.json,src/tests/mocks/env/events/pull_request_opened.json,src/tests/mocks/env/events/pull_request_reopened.json,src/tests/mocks/env/events/pull_request_synchronize.json,src/tests/mocks/env/events/push.json,src/tests/mocks/env/events/push_merge.json,src/tests/mocks/env/events/schedule.json,src/tests/mocks/env/index.test.ts,src/tests/mocks/env/index.ts,src/tests/mocks/fs/index.test.ts,src/tests/mocks/fs/index.ts,src/tests/mocks/github/index.test.ts,src/tests/mocks/github/index.ts,src/tests/mocks/octokit/endpoint/merge.test.ts,src/tests/mocks/octokit/endpoint/merge.ts,src/tests/mocks/octokit/index.test.ts,src/tests/mocks/octokit/index.ts,src/tests/mocks/octokit/paginate.test.ts,src/tests/mocks/octokit/paginate.ts,src/tests/mocks/octokit/payloads.ts,src/tests/mocks/octokit/pulls/listFiles.test.ts,src/tests/mocks/octokit/pulls/listFiles.ts,src/tests/mocks/octokit/repos/compareCommits.test.ts,src/tests/mocks/octokit/repos/compareCommits.ts,src/tests/payloads.ts,src/typings/ActionError/index.d.ts,src/typings/ChangedFiles/index.d.ts,src/typings/CoreMock/index.d.ts,src/typings/FsMock/index.d.ts,src/typings/GitHubFile/index.d.ts,src/typings/GitHubMock/index.d.ts,src/typings/Inferred/index.d.ts,src/typings/Inputs/index.d.ts,src/typings/OctokitMock/index.d.ts,src/typings/TestInput/index.d.ts,tsconfig.build.json,tsconfig.json,yarn.lock
75 .github/actions/file-changes-action/.github/actions/integration/events/files.json vendored Normal file
@@ -0,0 +1,75 @@
[
  ".codecov.yml",
  ".eslintignore",
  ".eslintrc.json",
  ".eslintrc.yml",
  ".github/workflows/integration.yml",
  ".github/workflows/pr.yml",
  ".github/workflows/push.yml",
  ".github/workflows/readme.md",
  ".gitignore",
  ".prettierignore",
  ".prettierrc.json",
  ".prettierrc.yml",
  ".releaserc.yml",
  "Makefile",
  "README.md",
  "__tests__/main.test.ts",
  "action.yml",
  "dist/index.js",
  "jest.config.js",
  "package.json",
  "src/ChangedFiles.ts",
  "src/File.ts",
  "src/FilesHelper.ts",
  "src/GithubHelper.ts",
  "src/InputHelper.ts",
  "src/UtilsHelper.ts",
  "src/main.ts",
  "src/tests/FilesHelper.test.ts",
  "src/tests/GithubHelper.test.ts",
  "src/tests/InputHelper.test.ts",
  "src/tests/UtilsHelper.test.ts",
  "src/tests/main.test.ts",
  "src/tests/mocks/core/index.test.ts",
  "src/tests/mocks/core/index.ts",
  "src/tests/mocks/env/events/issue_comment_created.json",
  "src/tests/mocks/env/events/issue_comment_edited.json",
  "src/tests/mocks/env/events/pull_request_opened.json",
  "src/tests/mocks/env/events/pull_request_reopened.json",
  "src/tests/mocks/env/events/pull_request_synchronize.json",
  "src/tests/mocks/env/events/push.json",
  "src/tests/mocks/env/events/push_merge.json",
  "src/tests/mocks/env/events/schedule.json",
  "src/tests/mocks/env/index.test.ts",
  "src/tests/mocks/env/index.ts",
  "src/tests/mocks/fs/index.test.ts",
  "src/tests/mocks/fs/index.ts",
  "src/tests/mocks/github/index.test.ts",
  "src/tests/mocks/github/index.ts",
  "src/tests/mocks/octokit/endpoint/merge.test.ts",
  "src/tests/mocks/octokit/endpoint/merge.ts",
  "src/tests/mocks/octokit/index.test.ts",
  "src/tests/mocks/octokit/index.ts",
  "src/tests/mocks/octokit/paginate.test.ts",
  "src/tests/mocks/octokit/paginate.ts",
  "src/tests/mocks/octokit/payloads.ts",
  "src/tests/mocks/octokit/pulls/listFiles.test.ts",
  "src/tests/mocks/octokit/pulls/listFiles.ts",
  "src/tests/mocks/octokit/repos/compareCommits.test.ts",
  "src/tests/mocks/octokit/repos/compareCommits.ts",
  "src/tests/payloads.ts",
  "src/typings/ActionError/index.d.ts",
  "src/typings/ChangedFiles/index.d.ts",
  "src/typings/CoreMock/index.d.ts",
  "src/typings/FsMock/index.d.ts",
  "src/typings/GitHubFile/index.d.ts",
  "src/typings/GitHubMock/index.d.ts",
  "src/typings/Inferred/index.d.ts",
  "src/typings/Inputs/index.d.ts",
  "src/typings/OctokitMock/index.d.ts",
  "src/typings/TestInput/index.d.ts",
  "tsconfig.build.json",
  "tsconfig.json",
  "yarn.lock"
]
1 .github/actions/file-changes-action/.github/actions/integration/events/files.txt vendored Normal file
@@ -0,0 +1 @@
.codecov.yml .eslintignore .eslintrc.json .eslintrc.yml .github/workflows/integration.yml .github/workflows/pr.yml .github/workflows/push.yml .github/workflows/readme.md .gitignore .prettierignore .prettierrc.json .prettierrc.yml .releaserc.yml Makefile README.md __tests__/main.test.ts action.yml dist/index.js jest.config.js package.json src/ChangedFiles.ts src/File.ts src/FilesHelper.ts src/GithubHelper.ts src/InputHelper.ts src/UtilsHelper.ts src/main.ts src/tests/FilesHelper.test.ts src/tests/GithubHelper.test.ts src/tests/InputHelper.test.ts src/tests/UtilsHelper.test.ts src/tests/main.test.ts src/tests/mocks/core/index.test.ts src/tests/mocks/core/index.ts src/tests/mocks/env/events/issue_comment_created.json src/tests/mocks/env/events/issue_comment_edited.json src/tests/mocks/env/events/pull_request_opened.json src/tests/mocks/env/events/pull_request_reopened.json src/tests/mocks/env/events/pull_request_synchronize.json src/tests/mocks/env/events/push.json src/tests/mocks/env/events/push_merge.json src/tests/mocks/env/events/schedule.json src/tests/mocks/env/index.test.ts src/tests/mocks/env/index.ts src/tests/mocks/fs/index.test.ts src/tests/mocks/fs/index.ts src/tests/mocks/github/index.test.ts src/tests/mocks/github/index.ts src/tests/mocks/octokit/endpoint/merge.test.ts src/tests/mocks/octokit/endpoint/merge.ts src/tests/mocks/octokit/index.test.ts src/tests/mocks/octokit/index.ts src/tests/mocks/octokit/paginate.test.ts src/tests/mocks/octokit/paginate.ts src/tests/mocks/octokit/payloads.ts src/tests/mocks/octokit/pulls/listFiles.test.ts src/tests/mocks/octokit/pulls/listFiles.ts src/tests/mocks/octokit/repos/compareCommits.test.ts src/tests/mocks/octokit/repos/compareCommits.ts src/tests/payloads.ts src/typings/ActionError/index.d.ts src/typings/ChangedFiles/index.d.ts src/typings/CoreMock/index.d.ts src/typings/FsMock/index.d.ts src/typings/GitHubFile/index.d.ts src/typings/GitHubMock/index.d.ts src/typings/Inferred/index.d.ts src/typings/Inputs/index.d.ts src/typings/OctokitMock/index.d.ts src/typings/TestInput/index.d.ts tsconfig.build.json tsconfig.json yarn.lock
1 .github/actions/file-changes-action/.github/actions/integration/events/files_added.csv vendored Normal file
@@ -0,0 +1 @@
.codecov.yml,.eslintrc.yml,.prettierrc.yml,.releaserc.yml,src/FilesHelper.ts,src/GithubHelper.ts,src/InputHelper.ts,src/UtilsHelper.ts,src/tests/FilesHelper.test.ts,src/tests/GithubHelper.test.ts,src/tests/InputHelper.test.ts,src/tests/UtilsHelper.test.ts,src/tests/main.test.ts,src/tests/mocks/core/index.test.ts,src/tests/mocks/core/index.ts,src/tests/mocks/env/events/issue_comment_created.json,src/tests/mocks/env/events/issue_comment_edited.json,src/tests/mocks/env/events/pull_request_opened.json,src/tests/mocks/env/events/pull_request_reopened.json,src/tests/mocks/env/events/pull_request_synchronize.json,src/tests/mocks/env/events/push.json,src/tests/mocks/env/events/push_merge.json,src/tests/mocks/env/events/schedule.json,src/tests/mocks/env/index.test.ts,src/tests/mocks/env/index.ts,src/tests/mocks/fs/index.test.ts,src/tests/mocks/fs/index.ts,src/tests/mocks/github/index.test.ts,src/tests/mocks/github/index.ts,src/tests/mocks/octokit/endpoint/merge.test.ts,src/tests/mocks/octokit/endpoint/merge.ts,src/tests/mocks/octokit/index.test.ts,src/tests/mocks/octokit/index.ts,src/tests/mocks/octokit/paginate.test.ts,src/tests/mocks/octokit/paginate.ts,src/tests/mocks/octokit/payloads.ts,src/tests/mocks/octokit/pulls/listFiles.test.ts,src/tests/mocks/octokit/pulls/listFiles.ts,src/tests/mocks/octokit/repos/compareCommits.test.ts,src/tests/mocks/octokit/repos/compareCommits.ts,src/tests/payloads.ts,src/typings/ActionError/index.d.ts,src/typings/ChangedFiles/index.d.ts,src/typings/CoreMock/index.d.ts,src/typings/FsMock/index.d.ts,src/typings/GitHubFile/index.d.ts,src/typings/GitHubMock/index.d.ts,src/typings/Inferred/index.d.ts,src/typings/Inputs/index.d.ts,src/typings/OctokitMock/index.d.ts,src/typings/TestInput/index.d.ts,tsconfig.build.json
Some files were not shown because too many files have changed in this diff.