Soft Actor-Critic for Discrete Action Settings

Petros Christodoulou
2019
Abstract

Soft Actor-Critic is a state-of-the-art reinforcement learning algorithm for continuous action settings that is not applicable to discrete action settings. Many important settings involve discrete actions, however, and so here we derive an alternative version of the Soft Actor-Critic algorithm that is applicable to discrete action settings. We then show that, even without any hyperparameter tuning, it is competitive with the tuned model-free state-of-the-art on a selection of games from the Atari suite.
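The core change the discrete-action derivation makes is that expectations over actions can be computed exactly from the policy's output probabilities rather than estimated from sampled actions. A minimal NumPy sketch of the resulting soft state value and policy objective for a single state (function names and the toy numbers are illustrative, not from the paper's implementation):

```python
import numpy as np

def soft_state_value(q_values, probs, alpha):
    """Exact soft value for a discrete-action state:
    V(s) = sum_a pi(a|s) * (Q(s,a) - alpha * log pi(a|s))."""
    return float(np.sum(probs * (q_values - alpha * np.log(probs))))

def policy_objective(q_values, probs, alpha):
    """Per-state policy loss to minimize:
    J_pi(s) = sum_a pi(a|s) * (alpha * log pi(a|s) - Q(s,a))."""
    return float(np.sum(probs * (alpha * np.log(probs) - q_values)))

# Toy example: 3 discrete actions, entropy temperature alpha = 0.2.
q = np.array([1.0, 2.0, 0.5])     # critic's Q-value estimates
pi = np.array([0.2, 0.7, 0.1])    # actor's action probabilities
print(soft_state_value(q, pi, 0.2))   # ≈ 1.81
```

Because the sum runs over all actions, no reparameterization trick or Monte-Carlo estimate of the entropy term is needed, which is what makes the discrete variant tractable.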


Code References

ray-project/ray, `rllib/algorithms/sac/README.md`:

> [SAC-Discrete](https://arxiv.org/pdf/1910.07207) is a variant of SAC that can be used for discrete action spaces …