Conversation

@yotsuda commented Dec 22, 2025

PR Summary

Reimplements ConvertTo-Json using System.Text.Json under the PSJsonSerializerV2 experimental feature, with V1-compatible behavior.

PR #26624 has been superseded by this PR. The approach has been refined based on @iSazonov's feedback to focus on V1 compatibility using System.Text.Json.

Fixes #5749

PR Context

Problem

When converting objects containing dictionaries with non-string keys (such as Exception.Data which uses IDictionary with object keys), ConvertTo-Json throws NonStringKeyInDictionary error.

Solution

This implementation uses a custom JsonConverter for PSObject that handles depth control internally, as suggested by @iSazonov. Serialization is done entirely by System.Text.Json (Newtonsoft is only referenced for JObject compatibility and the existing EscapeHandling parameter type).

Non-string dictionary keys: V2 converts all dictionary keys to strings via ToString() during serialization. This allows dictionaries like Exception.Data (which uses object keys) to be serialized without errors.
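As a rough illustration of this key-stringification rule, here is a minimal Python sketch of the behavior (not the PR's C# code; `stringify_keys` is an illustrative name):

```python
import json

def stringify_keys(obj):
    """Recursively convert dictionary keys to strings,
    mirroring V2's ToString()-based key conversion."""
    if isinstance(obj, dict):
        return {str(k): stringify_keys(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [stringify_keys(v) for v in obj]
    return obj

data = {1: "one", 2: "two"}               # non-string (int) keys
print(json.dumps(stringify_keys(data)))   # keys serialize as "1" and "2"
```

With this normalization in place, dictionaries like Exception.Data serialize instead of throwing.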

Key differences from V1:

| Aspect | V1 (Newtonsoft) | V2 (System.Text.Json) |
|---|---|---|
| Approach | Creates intermediate Dictionary/List objects, then serializes | Uses custom JsonConverter with PSObject wrapper |
| Serializer | Newtonsoft.Json | System.Text.Json |
| Non-string keys | ❌ Throws error (#5749) | ✓ Converted via ToString() |
| HiddenAttribute | ❌ Not respected (#9847) | ✓ Hidden properties excluded |
| Guid consistency | ❌ Different output for pipeline vs -InputObject (#26635) | ✓ Consistent string output |
| Maximum depth | 100 | 100 |

The -Depth parameter behavior:

  • Default depth: 2 (same as V1)
  • Maximum depth: 100 (same as V1)
  • Objects exceeding depth are converted to string representation (same as V1)
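The depth rules above can be sketched as follows (a Python analogue of the described behavior, not the PR's implementation; `to_json_value` is a made-up name):

```python
def to_json_value(obj, depth):
    """Sketch of the depth rule: below the limit, recurse;
    at the limit, non-primitive values fall back to their
    string representation (matching V1's truncation behavior)."""
    if isinstance(obj, (str, int, float, bool)) or obj is None:
        return obj
    if depth == 0:
        return str(obj)  # truncated branch becomes a string
    if isinstance(obj, dict):
        return {str(k): to_json_value(v, depth - 1) for k, v in obj.items()}
    if isinstance(obj, list):
        return [to_json_value(v, depth - 1) for v in obj]
    return str(obj)

nested = {"a": {"b": {"c": 1}}}
print(to_json_value(nested, 2))   # the innermost dict is stringified
```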

PR Checklist

  • PR has a meaningful title
  • Summarized changes
  • Make sure all .h, .cpp, .cs, .ps1 and .psm1 files have the correct copyright header
  • This PR is ready to merge
  • Breaking changes: Experimental feature needed
    • Experimental feature name: PSJsonSerializerV2
  • User-facing changes: Documentation needed
  • Testing: New tests added

Changes Made

New Files

ConvertToJsonCommandV2.cs (+618 lines)

  • Complete V2 implementation with JsonConverterPSObject
  • Depth tracking via writer.CurrentDepth
  • Support for all V1-compatible types

ConvertTo-Json.PSJsonSerializerV2.Tests.ps1 (+189 lines)

  • 25 tests total (20 V1/V2 compatible, 5 V2 only)

Modified Files

ExperimentalFeature.cs (+5 lines)

  • Added PSJsonSerializerV2 experimental feature constant

ConvertToJsonCommand.cs (+4 lines)

  • Added [Experimental(Hide)] attribute to hide V1 when V2 is enabled

Testing

Existing Tests (ConvertTo-Json.Tests.ps1)

The existing test suite (14 tests) was verified with V2:

  • 12 passed, 1 failed, 1 pending

The single failure is a DateTime test with timezone-dependent expectations (test expects UTC-7, fails in other timezones). This is unrelated to V1/V2 changes.

New Tests (ConvertTo-Json.PSJsonSerializerV2.Tests.ps1)

V1/V2 Compatible (20 tests)

These tests verify that V2 behaves identically to V1:

  • Special types: Uri, BigInteger, enums
  • Null handling: null, DBNull, NullString, ETS properties on DBNull
  • Collections: arrays, hashtable, nested objects
  • EscapeHandling: Default, EscapeHtml, EscapeNonAscii
  • Backward compatibility: JObject, Depth, AsArray, pipeline behavior
  • Depth limits: -Depth parameter accepts 0-100 only

V2 Only - Behavior Changes (5 tests)

These tests verify V2-specific improvements:

Related Work

Previous PR

| PR | Title | Date | Status | Description |
|---|---|---|---|---|
| #26624 | Add System.Text.Json serializer for ConvertTo-Json via PSJsonSerializerV2 experimental feature | Dec 2025 | Superseded | Initial implementation with different defaults; superseded by this PR based on @iSazonov's feedback |
| #11198 | Port ConvertTo-Json to .Net Core Json API | Nov 2019 - Apr 2022 | Closed | Comprehensive STJ migration attempt with 120+ comments discussing breaking changes, depth behavior, and compatibility concerns |

Directly Related Issues

| Issue | Title | Date | Status | Relevance |
|---|---|---|---|---|
| #8393 | Consider removing the default -Depth value from ConvertTo-Json | Dec 2018 | Closed | Default depth change discussion (2 → 64) |
| #5749 | Add Parameter to ConvertTo-Json to ignore unsupported properties | Dec 2017 | Open | Dictionary with non-string keys handling |
| #9847 | Class hidden properties still serialized to JSON | Jun 2019 | Closed | Fixed as side effect of STJ migration |
| #5797 | ConvertTo-Json: unexpected behavior with objects that have ETS properties | Jan 2018 | Closed | Serialization edge cases with NoteProperty/ScriptProperty |
| #6847 | ConvertTo-Json Honors JsonPropertyAttribute | May 2018 | Closed | JsonIgnoreAttribute support |
| #8381 | ConvertTo-Json: terminate cut off branches with explicit marker | Dec 2018 | Closed | Depth truncation visualization |

Indirectly Related (ConvertFrom-Json / Depth handling)

| Issue/PR | Title | Date | Status | Relevance |
|---|---|---|---|---|
| #3182 | In ConvertFrom-Json, the max depth for deserialization | Feb 2017 | Closed | Origin of depth parameter discussion |
| #8199 | Add configurable maximum depth in ConvertFrom-Json with -Depth | Nov 2018 | Merged | Depth handling precedent |
| #13592 | ConvertFrom-JSON incorrectly deserializes dates to DateTime | Sep 2020 | Closed | DateTime handling differences in STJ |
| #13598 | Add a -DateKind parameter to ConvertFrom-Json | Sep 2020 | Closed | References #11198 for STJ migration plans |

Future Considerations

The following enhancements could be considered for future iterations:

  1. Streaming output - Writing directly to a stream instead of building an in-memory string would improve memory efficiency and enable safe serialization at higher depth values
  2. Circular reference detection - Leveraging ReferenceHandler.IgnoreCycles to detect and handle circular references gracefully
  3. User-defined converters - Allowing users to provide custom JsonConverter instances for specific types
  4. -JsonSerializerOptions parameter - Exposing full STJ options for advanced scenarios (deferred per discussion with @iSazonov, prototype)
  5. ETS property exclusion option - Option to exclude Extended Type System properties for safer serialization of complex .NET objects
  6. Default depth reconsideration - The default depth of 2 may be too shallow for many use cases (see Consider removing the default -Depth value from ConvertTo-Json #8393)
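For item 2, the IgnoreCycles semantics - replace a node already on the current serialization path with null - can be sketched in Python (an analogue of .NET's ReferenceHandler.IgnoreCycles, not the PR's code):

```python
import json

def serialize_ignore_cycles(obj, seen=None):
    """Emit None (JSON null) when a container already on the
    current path is encountered again, breaking the cycle."""
    seen = seen or frozenset()
    if isinstance(obj, dict):
        if id(obj) in seen:
            return None                  # cycle -> null
        seen = seen | {id(obj)}
        return {str(k): serialize_ignore_cycles(v, seen) for k, v in obj.items()}
    if isinstance(obj, list):
        if id(obj) in seen:
            return None
        seen = seen | {id(obj)}
        return [serialize_ignore_cycles(v, seen) for v in obj]
    return obj

node = {"name": "root"}
node["self"] = node                      # circular reference
print(json.dumps(serialize_ignore_cycles(node)))  # {"name": "root", "self": null}
```

Note that tracking the path (rather than all visited objects) lets shared but non-circular subobjects serialize normally.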

Based on iSazonov's feedback and testing, this commit refactors the V2
implementation to use the standard JsonSerializer.Serialize() with custom
JsonConverter classes instead of the custom PowerShellJsonWriter.

Key changes:
- Removed PowerShellJsonWriter class (~500 lines of iterative serialization)
- Implemented JsonSerializer.Serialize() with custom JsonConverters:
  - JsonConverterPSObject: Handles PSObject serialization
  - JsonConverterInt64Enum: Converts long/ulong enums to strings
  - JsonConverterNullString: Serializes NullString as null
  - JsonConverterDBNull: Serializes DBNull as null
- Updated ConvertToJsonCommandV2:
  - Changed ValidateRange from (1, int.MaxValue) to (0, 1000)
  - Changed default Depth from int.MaxValue to 64
  - Updated XML documentation to reflect System.Text.Json limitations
- Deleted separate SystemTextJsonSerializer.cs file (integrated into ConvertToJsonCommand.cs)
- Updated .gitignore to exclude test files

Rationale:
Testing revealed that Utf8JsonWriter has a hardcoded MaxDepth limit of 1000,
making the custom iterative implementation unnecessary. Stack overflow is not
a practical concern with this limit, as iSazonov correctly identified.

Net result: ~400 lines of code removed while maintaining functionality.

Status:
- Build: Success
- Basic tests: Passing
- Full test suite: 4 passed, 9 failed (edge cases need fixing)
  - Known issues: NullString/DBNull serialization, BigInteger support

This is a work-in-progress commit to preserve the refactoring effort.
Further fixes needed for full test suite compliance.
Fixed issue where PSCustomObject properties with null values were
serialized as {prop:} instead of {prop:null}.

Root cause: The check 'value is null' does not detect PowerShell's
AutomationNull.Value.

Solution: Changed to use LanguagePrimitives.IsNull(value), which
properly handles PowerShell's null representations.

Test results:
- Before: 11 passed, 2 failed
- After: 12 passed, 1 failed, 1 skipped
- Remaining failure: DateTime timezone (acceptable per user guidance)

All critical functionality now working correctly.
Added ReferenceHandler.IgnoreCycles to JsonSerializerOptions to detect
and handle circular references automatically using .NET's built-in
mechanism.

This provides partial mitigation for depth-related issues by preventing
infinite loops in circular object graphs, though it does not fully solve
the depth tracking issue for deeply nested non-circular objects (e.g.,
Type properties).

Related to feedback from jborean93 regarding depth issues. Full depth
tracking implementation will follow in a subsequent commit.
@iSazonov added the CL-General label (indicates that a PR should be marked as a general cmdlet change in the Change Log) Dec 22, 2025
```csharp
}

// Wrap in PSObject to ensure ETS properties are preserved
var pso = PSObject.AsPSObject(objectToProcess);
```
Collaborator:

I think the intention is to use the cmdlet in a pipeline, in which case all objects coming from the pipeline are already wrapped in PSObject.

Or is this the only way the serializer can work?

Contributor Author:

You're right that pipeline objects are already wrapped. However, -InputObject parameter objects are not:

```powershell
$hash | ConvertTo-Json             # $hash is wrapped in PSObject
ConvertTo-Json -InputObject $hash  # $hash is NOT wrapped
```

PSObject.AsPSObject() is cheap (returns same object if already wrapped), so it handles both cases safely.
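The idempotent-wrapping pattern described here can be sketched generically (a toy Python model; `PSObjectLike` and `as_psobject` are illustrative stand-ins for PSObject and PSObject.AsPSObject):

```python
class PSObjectLike:
    """Toy stand-in for PSObject: holds a base object."""
    def __init__(self, base):
        self.base = base

def as_psobject(obj):
    # Idempotent: returns the same instance if already wrapped,
    # so applying it to pipeline input (already wrapped) costs nothing.
    return obj if isinstance(obj, PSObjectLike) else PSObjectLike(obj)

raw = {"a": 1}
wrapped = as_psobject(raw)
assert as_psobject(wrapped) is wrapped   # no double wrapping
```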

Collaborator:

As soon as STJ is extended so that we can use the standard serializer, we can reconsider this.

Collaborator:

If we process raw objects as raw objects (as discussed in another comment), we should do the same at this point too.

Contributor Author:

The entry point wraps input with PSObject.AsPSObject() (L213). Raw vs PSObject distinction is handled in nested object serialization via Option C implementation (WriteProperty and WriteValue check value is PSObject).

Collaborator:

Here we should keep the V1 InputObject behavior - process the object as raw.

Really we need two things - serialize a PSObject and serialize a raw object.
Due to the limitations of the standard serializer we cannot use it directly for serializing raw objects, so we have to use custom reflection by means of wrapping into PSObject. But we still have to distinguish whether the object was raw or originally a PSObject.

I suggest a simple trick - wrap the raw object in a new auxiliary class and create a separate converter for it. Both converters will be able to reuse the same code, with one difference when handling special properties.

Alternatively, we could add a flag to the PSObject instance and use one converter, but its code would be more complicated. The main advantage of the auxiliary class is that if we get STJ with the extensions we need, we can just delete the class and its converter without massive code changes.

@yotsuda commented Dec 22, 2025

@iSazonov Thank you for the thorough review!
I've pushed a commit (b37e6ae) addressing your feedback. Replying to each thread with details.

```csharp
    pso,
    PSObject.GetPropertyCollection(memberTypes));

foreach (var prop in properties)
```
Collaborator:

Can we use:

```csharp
foreach (var prop in pso.Properties)
```

Contributor Author:

pso.Properties returns both Extended and Adapted properties. The current code filters by memberTypes to support -ExcludeBaseProperties, which needs Extended-only when $true.

Comment on lines +190 to +199:

```csharp
var options = new JsonSerializerOptions()
{
    WriteIndented = !compressOutput,

    // Set high value to avoid System.Text.Json exceptions
    // User-specified depth is enforced by JsonConverterPSObject (max 100 via ValidateRange)
    MaxDepth = 1000,
    DefaultIgnoreCondition = JsonIgnoreCondition.Never,
    Encoder = GetEncoder(stringEscapeHandling),
};
```
Collaborator:

From my understanding, every instance of JsonSerializerOptions keeps its own type cache. If that is true, it makes sense to cache JsonSerializerOptions instances per runspace. The lazy cache needs two parameters: compressOutput and stringEscapeHandling.

Contributor Author:

Good point. Caching JsonSerializerOptions per runspace would improve performance. However, this requires architectural changes (runspace-level state management) that I think are beyond the scope of this PR. I'd like to address this in a follow-up PR if you agree.

Collaborator:

We can address this later.

Contributor Author:

Acknowledged. Will address JsonSerializerOptions caching in a future PR.

```csharp
int currentDepth = writer.CurrentDepth;

// Handle special types - check for null-like objects (no depth increment needed)
if (LanguagePrimitives.IsNull(obj) || obj is DBNull or System.Management.Automation.Language.NullString)
```
Collaborator:

Why do we check LanguagePrimitives.IsNull(obj)? I don't see this in the V1 code.
Will this change PSCustomObject processing?

Contributor Author:

The LanguagePrimitives.IsNull() check handles AutomationNull.Value in addition to regular null. This is needed because PowerShell pipeline can produce AutomationNull.Value which should be serialized as JSON null. V1 handles this differently through Newtonsoft's converters. This doesn't change PSCustomObject processing - it only affects null-like values.

Collaborator:

What is a scenario for the check?

Contributor Author:

The check handles AutomationNull.Value when it becomes the BaseObject of a PSObject.

In C#, obj is null returns false for AutomationNull.Value (it's a singleton object), but LanguagePrimitives.IsNull(obj) returns true.

Scenario (variable names were lost in extraction; placeholders used):

```powershell
$value = [System.Management.Automation.Internal.AutomationNull]::Value
$pso = [PSObject]::AsPSObject($value)
$pso.BaseObject -eq $value  # True
$pso | ConvertTo-Json       # Output: null
```

Without this check, AutomationNull.Value would be serialized as an object instead of null.

See: L266

Collaborator:

AutomationNull is for the engine only; it is intended for internal use in the pipeline for null transmission and should not be used explicitly.

Comment on lines +359 to +362:

```csharp
var psoItem = PSObject.AsPSObject(item);

// Recursive call - Write will handle depth tracking
Write(writer, psoItem, options);
```
Collaborator @iSazonov, Dec 23, 2025:

Should we take into account whether the source object is raw or a PSObject? My concern is that the wrapping will add extra properties (extended and adapted).
The question applies to every place where we use the pattern.

I understand the trick as replacing reflection in the standard STJ API with PSObject reflection.
Perhaps we should add a specific property to the PSObject to indicate a raw object, and take it into account in the PSObject converter to decide whether or not to add extended/adapted properties.

This also assumes that while we enumerate PSObject properties we can check their kind and check for JsonIgnore.

Contributor Author:

I tested several scenarios and the current implementation appears to serialize only the actual properties without adding extra Extended/Adapted properties. However, I may be missing edge cases. Please let me know if you see different behavior - I'd be happy to investigate further.

Collaborator @iSazonov, Dec 23, 2025:

I compared outputs for a FileInfo object in V1 - 1. pipeline and 2. InputObject - and they are different. What is the result for V2? We could also check a class with a FileInfo property.

Contributor Author:

Thank you for the feedback. I investigated the V1 vs V2 behavior in detail.

Summary of Findings

Direct serialization (Guid, IPAddress)

| Type | V1 Pipeline | V1 InputObject | V2 Pipeline | V2 InputObject |
|---|---|---|---|---|
| Guid | object (2 props) | string | string | string |
| IPAddress | 10 props (w/ IPAddressToString) | 9 props (wo/ IPAddressToString) | 9 props | 9 props |

V1 issues: Pipeline vs InputObject produces different output for Guid and IPAddress.

V2 resolves these inconsistencies - Pipeline and InputObject produce identical output.

FileInfo as a class property

```powershell
class TestClass { [System.IO.FileInfo]$File }
$obj = [TestClass]::new()
$obj.File = Get-Item $env:windir\notepad.exe  # or [FileInfo]::new(...)
$obj | ConvertTo-Json -Depth 2
```

| Source | V1 Pipeline | V1 InputObject | V2 Pipeline | V2 InputObject |
|---|---|---|---|---|
| Get-Item (has Extended props) | 17 props | 17 props | 30 props | 30 props |
| [FileInfo]::new() (no Extended props) | 17 props | 17 props | 24 props | 24 props |

V1: Always outputs Base properties only (17) because Newtonsoft.Json uses direct .NET reflection on nested objects.

V2: Outputs Base (17) + Adapted (7) + Extended (0-6), depending on whether the object was PSObject-wrapped.

Possible Approaches

| Approach | V1 Compatible | Trade-off |
|---|---|---|
| A. Keep current (hardcoded list) | No | Consistent, but requires manual updates for new types |
| B. Base only (PSMemberViewTypes.Base) | Yes | Loses Adapted props (Mode, VersionInfo, BaseName) |
| C. Track raw vs PSObject before wrapping | Partial | Replaces hardcoded list, needs careful implementation |

Approach A: The current V2 implementation uses a hardcoded IsPrimitiveType list (Guid, DateTime, Uri, BigInteger, etc.). This is not ideal for long-term maintenance as new .NET types require manual updates.

Approach C: I found we can use existing mechanisms to replace the hardcoded list:

  1. Check item is PSObject before calling PSObject.AsPSObject() to decide whether to include Extended properties
  2. Use MemberType to identify property categories:
| MemberType | Category | Example |
|---|---|---|
| Property | Base | Name, Length, FullName |
| NoteProperty | Extended | PSPath, PSDrive, PSProvider |
| CodeProperty | Adapted | Mode, ModeWithoutHardLink |
| ScriptProperty | Adapted | VersionInfo, BaseName |

Question

Which approach would you prefer?

  • A: Keep current behavior (accept hardcoded list for now)
  • B: Use Base only (V1 compatible, but loses Adapted props like Mode, VersionInfo)
  • C: Implement raw vs PSObject tracking (replaces hardcoded list, partial V1 compatibility)

Collaborator:

We should preserve the V1 capability:

  1. allow users to serialize raw objects with the InputObject parameter
  2. serialize nested objects (property values) as raw objects without adding extra properties.

That is option C.

Contributor Author:

Done in fdbe94ae6. Implemented Option C:

  • value is PSObject → Serialize with Extended | Adapted properties
  • Raw .NET object → Serialize with Base properties only (V1 compatible)

Key changes:

V1 compatibility verified:

| Scenario | V1 | V2 |
|---|---|---|
| Nested raw FileInfo | 17 props | 17 props |
| Nested Get-Item FileInfo | 30 props | 30 props |

```csharp
try
{
    var value = prop.Value;
    writer.WritePropertyName(prop.Name);
```
Collaborator:

Why do we need specific processing here? Why not delegate the value processing to the PSObject serializer?

Contributor Author:

The maxDepth == 0 check converts non-primitive values to strings, which is V1-compatible behavior for depth limiting. If we delegated to the PSObject serializer without this check, it would attempt full serialization and fail the depth check later. Handling it here provides better control and matches V1 behavior.

Collaborator:

With option C implemented, I believe this will change. I'd still expect generic processing here; the V1 version does recursion at this point.

Contributor Author:

Done in fdbe94a. The maxDepth == 0 handling works with Option C. See WritePropertyRaw L616-L642 which handles recursion via SerializeRawValue.


```csharp
// Note: JsonIgnoreAttribute check would require reflection on the underlying member
// which may not be available for all PSPropertyInfo types. For now, we rely on
// IsHidden to filter properties that should not be serialized.
```
Collaborator:

Enumerating properties in PSObject is already reflection, so one more check does not add more problems.

Contributor Author:

Done in 0e07b33. Added JsonIgnoreAttribute check in ShouldSkipProperty. For PSProperty backed by MemberInfo, we now check for the attribute.

Collaborator:

nit: Maybe check the Newtonsoft attribute too? We need to discuss.

Contributor Author:

Currently only checking System.Text.Json.Serialization.JsonIgnoreAttribute: L694-L700

Question: Should we also check Newtonsoft.Json.JsonIgnoreAttribute for backward compatibility? V1 uses Newtonsoft, so users may have decorated their classes with that attribute.

…n, add nullability, use expression body, reorder null check, annotate IsNull, honor PSSerializeJSONLongEnumAsNumber, avoid double enumeration, add JsonIgnore check
yotsuda and others added 5 commits December 24, 2025 18:49
…Cmdlet/ConvertToJsonCommandV2.cs

Co-authored-by: Ilya <darpa@yandex.ru>
@yotsuda commented Dec 25, 2025

Summary

All requested changes have been implemented and tested.

| Test File | V1 | V2 |
|---|---|---|
| ConvertTo-Json.Tests.ps1 | 31 Passed, 1 Failed* | 31 Passed, 1 Failed* |
| ConvertTo-Json.PSJsonSerializerV2.Tests.ps1 | 5 Skipped | 5 Passed |

*DateTime test fails due to timezone (existing issue, not related to this PR)

Commits to push

| Commit | Description |
|---|---|
| fdbe94a | Match V1 serialization for nested raw objects (includes DBNull/NullString fix) |
| c7b7ab1 | Sync ConvertTo-Json.Tests.ps1 with PR #26639 |

@iSazonov (Collaborator):

> *DateTime test fails due to timezone (existing issue, not related to this PR)

Could you please point to the test in a new issue?

@yotsuda commented Dec 25, 2025

> *DateTime test fails due to timezone (existing issue, not related to this PR)
>
> Could you please point to the test in a new issue?

@iSazonov This was a local environment issue on my machine. I shouldn't have mentioned it. Sorry for the confusion.

@yotsuda commented Dec 25, 2025

As @iSazonov suggested:

> Here we should keep V1 InputObject behavior - process the object as raw.
>
> Really we need two things - serialize PSObject and serialize raw object. Due to the limitations of the standard serializer we cannot use it directly for serializing raw objects. So we have to use custom reflection by means of wrapping to PSObject. But we still have to distinguish whether the object was raw or originally PSObject.
>
> I suggest using a simple trick - to wrap the raw object in a new auxiliary class and create a separate converter for it. Both converters will be able to re-use the same code with one difference when handling special properties.
>
> Alternatively, we could add some flag to the PSObject instance and use one converter, but its code will be more complicated, but the main thing is if we get STJ with the extensions we need, we'll just delete the new auxiliary class and its converter without massive code change.

Implemented RawObjectWrapper approach (aa9a89f):

  1. Entry point now branches on objectToProcess is PSObject:

    • PSObject → JsonConverterPSObject (Extended/Adapted properties)
    • Raw object → RawObjectWrapper + JsonConverterRawObject (Base properties only)
  2. Both converters share similar structure for easy future removal when STJ gains needed extensions.

Removed hardcoded type list:

With the raw/PSObject distinction in place, IsPrimitiveType (a hardcoded list of DateTime, Guid, Uri, etc.) is no longer needed. It is replaced with IsStjNativeScalarType(), which dynamically detects whether STJ natively serializes a type as a scalar by invoking STJ and checking whether the result starts with { or [. Results are cached per type. This automatically adapts to future .NET/STJ type additions (DateOnly, TimeOnly, etc.).
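The probing idea generalizes beyond C#; a minimal Python analogue (illustrative only - the PR's actual implementation is IsStjNativeScalarType in ConvertToJsonCommandV2.cs) asks the serializer itself whether a type renders as a scalar, with per-type caching:

```python
import json
from functools import lru_cache

@lru_cache(maxsize=None)
def is_native_scalar(tp):
    """Probe the serializer with a sample instance and check
    whether the output is an object/array ('{' or '[') or a scalar.
    Assumes tp has a default constructor; otherwise report non-scalar."""
    try:
        sample = tp()
        text = json.dumps(sample)
    except TypeError:
        return False
    return not text.startswith(("{", "["))

print(is_native_scalar(int))    # scalar: serializes as a bare number
print(is_native_scalar(dict))   # non-scalar: serializes as an object
```

Caching per type (here via `lru_cache`) keeps the probe a one-time cost, mirroring the PR's per-type result cache.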

STJ limitation workarounds (not a type list):

  • BigInteger: STJ serializes as object, but JSON standard expects number
  • Infinity/NaN: STJ throws, but V1 serializes as string

Test Results:

  • Pester V2: 5/5 passed
  • Pester Common: 31/33 passed (1 DateTime timezone issue, 1 pending)
  • xUnit: 5/5 passed

Collaborator @iSazonov left a comment:

My previous idea was that we could just delete the JsonConverterRawObject when it is no longer needed. But I see that a lot of code/logic is duplicated. We could get rid of this by using helper methods. If they complicate the code too much, we may choose a different path:

The only difference between JsonConverterPSObject and JsonConverterRawObject is in the processing of PSObject properties. I'd consider unifying JsonConverterPSObject and JsonConverterRawObject into one PSJsonConverterObject<T>, where T is either PSObject or RawObjectWrapper. Then we could use conditions like typeof(T) == typeof(PSObject) or typeof(T) == typeof(RawObjectWrapper) to select the appropriate code. In this case, removing the RawObjectWrapper will also be easy.
Or use a delegate for the specific "processing of PSObject properties".

Comment on lines +43 to +49:

```powershell
It 'Should serialize Guid as string consistently' {
    $guid = [guid]"12345678-1234-1234-1234-123456789abc"
    $jsonPipeline = $guid | ConvertTo-Json -Compress
    $jsonInputObject = ConvertTo-Json -InputObject $guid -Compress
    $jsonPipeline | Should -BeExactly '"12345678-1234-1234-1234-123456789abc"'
    $jsonInputObject | Should -BeExactly '"12345678-1234-1234-1234-123456789abc"'
}
```
Collaborator:

Do you mean we have a breaking change with the scenario?

```powershell
$dict = @{ 1 = "one"; 2 = "two" }
$json = $dict | ConvertTo-Json -Compress
$json | Should -Match '"1":\s*"one"'
$json | Should -Match '"2":\s*"two"'
```
Collaborator:

Since we use -Compress we can explicitly compare with {"2":"two","1":"one"}.
I suggest using this pattern in (all) other tests too - it is more reliable and more readable. The same goes for "-Not -Throw": it is better to replace those with such "positive" tests where possible.


```csharp
bool wroteStart = false;
foreach (var prop in etsProperties)
{
    if (!JsonSerializerHelper.ShouldSkipProperty(prop))
```
Collaborator:

Please use the same pattern everywhere. See lines 450 and 760.

```csharp
    pso,
    PSObject.GetPropertyCollection(PSMemberViewTypes.Extended));

bool wroteStart = false;
```
Collaborator:

In this context (given how we use it), I'd rename it to isNotNull, or reverse it to bool isNull = true;

```csharp
}
else
{
    var psoItem = PSObject.AsPSObject(item);
```
Collaborator:

Should we do the same as in line 214? (Wrap as raw object if needed?)

```csharp
    return;
}

var pso = PSObject.AsPSObject(value);
```
Collaborator:

The same.
