mirror of https://github.com/fog/fog.git, synced 2022-11-09 13:51:43 -05:00
8426fc9abf
S3 does not require normalization of object keys and uses strict byte comparison of keys, not Unicode-equivalence comparison, to store and retrieve objects. This means that storing and retrieving objects with fog, which normalizes keys, would leave those objects inaccessible to other libraries, languages, and systems that don't normalize. Given that normalization has no benefit, except perhaps reducing the byte count of keys, it ought to be removed.
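A quick illustrative sketch (not part of the change) of why the byte-level difference matters:

# Illustrative only: the composed and decomposed spellings of "é" render
# identically but are different byte sequences, and S3 compares object keys
# byte-for-byte.
composed   = [0x00E9].pack("U*")          # "é" as U+00E9,          bytes C3 A9
decomposed = [0x0065, 0x0301].pack("U*")  # "e" + combining acute,  bytes 65 CC 81

composed == decomposed   # => false
composed.bytes           # => [195, 169]
decomposed.bytes         # => [101, 204, 129]

# If fog folded the decomposed key into the composed form before signing and
# uploading, the object would be stored under the C3 A9 key, and any client
# that passes the caller's original key through untouched would miss it.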
17 lines · 570 B · Ruby
# encoding: utf-8

Shindo.tests('AWS | signed_params', ['aws']) do
  returns( Fog::AWS.escape( "'Stöp!' said Fred_-~./" ) ) { "%27St%C3%B6p%21%27%20said%20Fred_-~.%2F" }

  tests('Unicode characters should be escaped') do
    unicode = ["00E9".to_i(16)].pack("U*")
    escaped = "%C3%A9"
    returns( escaped ) { Fog::AWS.escape( unicode ) }
  end

  tests('Unicode characters with combining marks should be escaped') do
    unicode = ["0065".to_i(16), "0301".to_i(16)].pack("U*")
    escaped = "e%CC%81"
    returns( escaped ) { Fog::AWS.escape( unicode ) }
  end
end
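Read outside the Shindo harness, the two Unicode expectations above amount to the following (assuming fog is loaded so Fog::AWS.escape is available):

Fog::AWS.escape([0x00E9].pack("U*"))          # => "%C3%A9"
Fog::AWS.escape([0x0065, 0x0301].pack("U*"))  # => "e%CC%81", not "%C3%A9"

# The second result is the point of the change: the decomposed sequence is
# escaped byte-for-byte rather than being normalized to U+00E9 first.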