Commit 5cde0af2a9825dd1edaca233bd9590566579ef21

Authored by Herbert Xu
1 parent 5c64097aa0

[CRYPTO] cipher: Added block cipher type

This patch adds a new block cipher type.  Unlike the current cipher
algorithms, which operate on a single block at a time, block ciphers
operate on an arbitrarily long linear area of data.  Because the type is
block-based, any data remaining at the end that cannot form a complete
block is skipped.

The block cipher type has one major difference compared to the existing
cipher implementation: the sg (scatterlist) walking is now performed by
the algorithm rather than by the cipher mid-layer.  This is needed for
drivers that support sg lists directly.  It also improves performance
for all algorithms, since it reduces the total number of indirect calls
by one.

In the future, the existing cipher algorithm type will be converted to a
single-block interface.  This will be done once all existing users have
switched over to the new block cipher type.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>

Showing 5 changed files with 655 additions and 0 deletions
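To illustrate the new interface, here is a minimal sketch (not part of this
commit; all example_* names are hypothetical) of an encrypt handler driving
the walk API added in crypto/blkcipher.c below.  A real algorithm would
transform block-sized chunks; this toy XOR stand-in just shows the loop shape:

static int example_encrypt(struct blkcipher_desc *desc,
			   struct scatterlist *dst, struct scatterlist *src,
			   unsigned int nbytes)
{
	struct blkcipher_walk walk;
	int err;

	/* The algorithm, not the mid-layer, walks the scatterlists. */
	blkcipher_walk_init(&walk, dst, src, nbytes);
	err = blkcipher_walk_virt(desc, &walk);

	while (walk.nbytes) {
		/* walk.src.virt.addr/walk.dst.virt.addr point at a linear
		 * chunk of walk.nbytes bytes (possibly aliased in place). */
		u8 *sp = walk.src.virt.addr;
		u8 *dp = walk.dst.virt.addr;
		unsigned int n = walk.nbytes;

		while (n--)
			*dp++ = *sp++ ^ 0xff;	/* toy transform only */

		/* Third argument: bytes left unprocessed in this chunk. */
		err = blkcipher_walk_done(desc, &walk, 0);
	}

	return err;
}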

crypto/Kconfig

#
# Cryptographic API Configuration
#

menu "Cryptographic options"

config CRYPTO
	bool "Cryptographic API"
	help
	  This option provides the core Cryptographic API.

if CRYPTO

config CRYPTO_ALGAPI
	tristate
	help
	  This option provides the API for cryptographic algorithms.

+config CRYPTO_BLKCIPHER
+	tristate
+	select CRYPTO_ALGAPI
+
config CRYPTO_MANAGER
	tristate "Cryptographic algorithm manager"
	select CRYPTO_ALGAPI
	default m
	help
	  Create default cryptographic template instantiations such as
	  cbc(aes).

config CRYPTO_HMAC
	bool "HMAC support"
	help
	  HMAC: Keyed-Hashing for Message Authentication (RFC2104).
	  This is required for IPSec.

config CRYPTO_NULL
	tristate "Null algorithms"
	select CRYPTO_ALGAPI
	help
	  These are 'Null' algorithms, used by IPsec, which do nothing.

config CRYPTO_MD4
	tristate "MD4 digest algorithm"
	select CRYPTO_ALGAPI
	help
	  MD4 message digest algorithm (RFC1320).

config CRYPTO_MD5
	tristate "MD5 digest algorithm"
	select CRYPTO_ALGAPI
	help
	  MD5 message digest algorithm (RFC1321).

config CRYPTO_SHA1
	tristate "SHA1 digest algorithm"
	select CRYPTO_ALGAPI
	help
	  SHA-1 secure hash standard (FIPS 180-1/DFIPS 180-2).

config CRYPTO_SHA1_S390
	tristate "SHA1 digest algorithm (s390)"
	depends on S390
	select CRYPTO_ALGAPI
	help
	  This is the s390 hardware accelerated implementation of the
	  SHA-1 secure hash standard (FIPS 180-1/DFIPS 180-2).

config CRYPTO_SHA256
	tristate "SHA256 digest algorithm"
	select CRYPTO_ALGAPI
	help
	  SHA256 secure hash standard (DFIPS 180-2).

	  This version of SHA implements a 256 bit hash with 128 bits of
	  security against collision attacks.

config CRYPTO_SHA256_S390
	tristate "SHA256 digest algorithm (s390)"
	depends on S390
	select CRYPTO_ALGAPI
	help
	  This is the s390 hardware accelerated implementation of the
	  SHA256 secure hash standard (DFIPS 180-2).

	  This version of SHA implements a 256 bit hash with 128 bits of
	  security against collision attacks.

config CRYPTO_SHA512
	tristate "SHA384 and SHA512 digest algorithms"
	select CRYPTO_ALGAPI
	help
	  SHA512 secure hash standard (DFIPS 180-2).

	  This version of SHA implements a 512 bit hash with 256 bits of
	  security against collision attacks.

	  This code also includes SHA-384, a 384 bit hash with 192 bits
	  of security against collision attacks.

config CRYPTO_WP512
	tristate "Whirlpool digest algorithms"
	select CRYPTO_ALGAPI
	help
	  Whirlpool hash algorithm 512, 384 and 256-bit hashes

	  Whirlpool-512 is part of the NESSIE cryptographic primitives.
	  Whirlpool will be part of the ISO/IEC 10118-3:2003(E) standard

	  See also:
	  <http://planeta.terra.com.br/informatica/paulobarreto/WhirlpoolPage.html>

config CRYPTO_TGR192
	tristate "Tiger digest algorithms"
	select CRYPTO_ALGAPI
	help
	  Tiger hash algorithm 192, 160 and 128-bit hashes

	  Tiger is a hash function optimized for 64-bit processors while
	  still having decent performance on 32-bit processors.
	  Tiger was developed by Ross Anderson and Eli Biham.

	  See also:
	  <http://www.cs.technion.ac.il/~biham/Reports/Tiger/>.

config CRYPTO_DES
	tristate "DES and Triple DES EDE cipher algorithms"
	select CRYPTO_ALGAPI
	help
	  DES cipher algorithm (FIPS 46-2), and Triple DES EDE (FIPS 46-3).

config CRYPTO_DES_S390
	tristate "DES and Triple DES cipher algorithms (s390)"
	depends on S390
	select CRYPTO_ALGAPI
	help
	  DES cipher algorithm (FIPS 46-2), and Triple DES EDE (FIPS 46-3).

config CRYPTO_BLOWFISH
	tristate "Blowfish cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  Blowfish cipher algorithm, by Bruce Schneier.

	  This is a variable key length cipher which can use keys from 32
	  bits to 448 bits in length.  It's fast, simple and specifically
	  designed for use on "large microprocessors".

	  See also:
	  <http://www.schneier.com/blowfish.html>

config CRYPTO_TWOFISH
	tristate "Twofish cipher algorithm"
	select CRYPTO_ALGAPI
	select CRYPTO_TWOFISH_COMMON
	help
	  Twofish cipher algorithm.

	  Twofish was submitted as an AES (Advanced Encryption Standard)
	  candidate cipher by researchers at CounterPane Systems.  It is a
	  16 round block cipher supporting key sizes of 128, 192, and 256
	  bits.

	  See also:
	  <http://www.schneier.com/twofish.html>

config CRYPTO_TWOFISH_COMMON
	tristate
	help
	  Common parts of the Twofish cipher algorithm shared by the
	  generic c and the assembler implementations.

config CRYPTO_TWOFISH_586
	tristate "Twofish cipher algorithms (i586)"
	depends on (X86 || UML_X86) && !64BIT
	select CRYPTO_ALGAPI
	select CRYPTO_TWOFISH_COMMON
	help
	  Twofish cipher algorithm.

	  Twofish was submitted as an AES (Advanced Encryption Standard)
	  candidate cipher by researchers at CounterPane Systems.  It is a
	  16 round block cipher supporting key sizes of 128, 192, and 256
	  bits.

	  See also:
	  <http://www.schneier.com/twofish.html>

config CRYPTO_TWOFISH_X86_64
	tristate "Twofish cipher algorithm (x86_64)"
	depends on (X86 || UML_X86) && 64BIT
	select CRYPTO_ALGAPI
	select CRYPTO_TWOFISH_COMMON
	help
	  Twofish cipher algorithm (x86_64).

	  Twofish was submitted as an AES (Advanced Encryption Standard)
	  candidate cipher by researchers at CounterPane Systems.  It is a
	  16 round block cipher supporting key sizes of 128, 192, and 256
	  bits.

	  See also:
	  <http://www.schneier.com/twofish.html>

config CRYPTO_SERPENT
	tristate "Serpent cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  Serpent cipher algorithm, by Anderson, Biham & Knudsen.

	  Keys are allowed to be from 0 to 256 bits in length, in steps
	  of 8 bits.  Also includes the 'Tnepres' algorithm, a reversed
	  variant of Serpent for compatibility with old kerneli code.

	  See also:
	  <http://www.cl.cam.ac.uk/~rja14/serpent.html>

config CRYPTO_AES
	tristate "AES cipher algorithms"
	select CRYPTO_ALGAPI
	help
	  AES cipher algorithms (FIPS-197). AES uses the Rijndael
	  algorithm.

	  Rijndael appears to be consistently a very good performer in
	  both hardware and software across a wide range of computing
	  environments regardless of its use in feedback or non-feedback
	  modes. Its key setup time is excellent, and its key agility is
	  good. Rijndael's very low memory requirements make it very well
	  suited for restricted-space environments, in which it also
	  demonstrates excellent performance. Rijndael's operations are
	  among the easiest to defend against power and timing attacks.

	  The AES specifies three key sizes: 128, 192 and 256 bits

	  See <http://csrc.nist.gov/CryptoToolkit/aes/> for more information.

config CRYPTO_AES_586
	tristate "AES cipher algorithms (i586)"
	depends on (X86 || UML_X86) && !64BIT
	select CRYPTO_ALGAPI
	help
	  AES cipher algorithms (FIPS-197). AES uses the Rijndael
	  algorithm.

	  Rijndael appears to be consistently a very good performer in
	  both hardware and software across a wide range of computing
	  environments regardless of its use in feedback or non-feedback
	  modes. Its key setup time is excellent, and its key agility is
	  good. Rijndael's very low memory requirements make it very well
	  suited for restricted-space environments, in which it also
	  demonstrates excellent performance. Rijndael's operations are
	  among the easiest to defend against power and timing attacks.

	  The AES specifies three key sizes: 128, 192 and 256 bits

	  See <http://csrc.nist.gov/encryption/aes/> for more information.

config CRYPTO_AES_X86_64
	tristate "AES cipher algorithms (x86_64)"
	depends on (X86 || UML_X86) && 64BIT
	select CRYPTO_ALGAPI
	help
	  AES cipher algorithms (FIPS-197). AES uses the Rijndael
	  algorithm.

	  Rijndael appears to be consistently a very good performer in
	  both hardware and software across a wide range of computing
	  environments regardless of its use in feedback or non-feedback
	  modes. Its key setup time is excellent, and its key agility is
	  good. Rijndael's very low memory requirements make it very well
	  suited for restricted-space environments, in which it also
	  demonstrates excellent performance. Rijndael's operations are
	  among the easiest to defend against power and timing attacks.

	  The AES specifies three key sizes: 128, 192 and 256 bits

	  See <http://csrc.nist.gov/encryption/aes/> for more information.

config CRYPTO_AES_S390
	tristate "AES cipher algorithms (s390)"
	depends on S390
	select CRYPTO_ALGAPI
	help
	  This is the s390 hardware accelerated implementation of the
	  AES cipher algorithms (FIPS-197). AES uses the Rijndael
	  algorithm.

	  Rijndael appears to be consistently a very good performer in
	  both hardware and software across a wide range of computing
	  environments regardless of its use in feedback or non-feedback
	  modes. Its key setup time is excellent, and its key agility is
	  good. Rijndael's very low memory requirements make it very well
	  suited for restricted-space environments, in which it also
	  demonstrates excellent performance. Rijndael's operations are
	  among the easiest to defend against power and timing attacks.

	  On s390 the System z9-109 currently only supports the key size
	  of 128 bit.

config CRYPTO_CAST5
	tristate "CAST5 (CAST-128) cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  The CAST5 encryption algorithm (synonymous with CAST-128) is
	  described in RFC2144.

config CRYPTO_CAST6
	tristate "CAST6 (CAST-256) cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  The CAST6 encryption algorithm (synonymous with CAST-256) is
	  described in RFC2612.

config CRYPTO_TEA
	tristate "TEA, XTEA and XETA cipher algorithms"
	select CRYPTO_ALGAPI
	help
	  TEA cipher algorithm.

	  Tiny Encryption Algorithm is a simple cipher that uses
	  many rounds for security.  It is very fast and uses
	  little memory.

	  Xtendend Tiny Encryption Algorithm is a modification to
	  the TEA algorithm to address a potential key weakness
	  in the TEA algorithm.

	  Xtendend Encryption Tiny Algorithm is a mis-implementation
	  of the XTEA algorithm for compatibility purposes.

config CRYPTO_ARC4
	tristate "ARC4 cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  ARC4 cipher algorithm.

	  ARC4 is a stream cipher using keys ranging from 8 bits to 2048
	  bits in length.  This algorithm is required for driver-based
	  WEP, but it should not be for other purposes because of the
	  weakness of the algorithm.

config CRYPTO_KHAZAD
	tristate "Khazad cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  Khazad cipher algorithm.

	  Khazad was a finalist in the initial NESSIE competition.  It is
	  an algorithm optimized for 64-bit processors with good performance
	  on 32-bit processors.  Khazad uses an 128 bit key size.

	  See also:
	  <http://planeta.terra.com.br/informatica/paulobarreto/KhazadPage.html>

config CRYPTO_ANUBIS
	tristate "Anubis cipher algorithm"
	select CRYPTO_ALGAPI
	help
	  Anubis cipher algorithm.

	  Anubis is a variable key length cipher which can use keys from
	  128 bits to 320 bits in length.  It was evaluated as a entrant
	  in the NESSIE competition.

	  See also:
	  <https://www.cosic.esat.kuleuven.ac.be/nessie/reports/>
	  <http://planeta.terra.com.br/informatica/paulobarreto/AnubisPage.html>


config CRYPTO_DEFLATE
	tristate "Deflate compression algorithm"
	select CRYPTO_ALGAPI
	select ZLIB_INFLATE
	select ZLIB_DEFLATE
	help
	  This is the Deflate algorithm (RFC1951), specified for use in
	  IPSec with the IPCOMP protocol (RFC3173, RFC2394).

	  You will most probably want this if using IPSec.

config CRYPTO_MICHAEL_MIC
	tristate "Michael MIC keyed digest algorithm"
	select CRYPTO_ALGAPI
	help
	  Michael MIC is used for message integrity protection in TKIP
	  (IEEE 802.11i).  This algorithm is required for TKIP, but it
	  should not be used for other purposes because of the weakness
	  of the algorithm.

config CRYPTO_CRC32C
	tristate "CRC32c CRC algorithm"
	select CRYPTO_ALGAPI
	select LIBCRC32C
	help
	  Castagnoli, et al Cyclic Redundancy-Check Algorithm.  Used
	  by iSCSI for header and data digests and by others.
	  See Castagnoli93.  This implementation uses lib/libcrc32c.
	  Module will be crc32c.

config CRYPTO_TEST
	tristate "Testing module"
	depends on m
	select CRYPTO_ALGAPI
	help
	  Quick & dirty crypto test module.

source "drivers/crypto/Kconfig"

endif	# if CRYPTO

endmenu
crypto/Makefile

#
# Cryptographic API
#

obj-$(CONFIG_CRYPTO) += api.o scatterwalk.o cipher.o digest.o compress.o

crypto_algapi-$(CONFIG_PROC_FS) += proc.o
crypto_algapi-objs := algapi.o $(crypto_algapi-y)
obj-$(CONFIG_CRYPTO_ALGAPI) += crypto_algapi.o

+obj-$(CONFIG_CRYPTO_BLKCIPHER) += blkcipher.o
+
obj-$(CONFIG_CRYPTO_MANAGER) += cryptomgr.o
obj-$(CONFIG_CRYPTO_HMAC) += hmac.o
obj-$(CONFIG_CRYPTO_NULL) += crypto_null.o
obj-$(CONFIG_CRYPTO_MD4) += md4.o
obj-$(CONFIG_CRYPTO_MD5) += md5.o
obj-$(CONFIG_CRYPTO_SHA1) += sha1.o
obj-$(CONFIG_CRYPTO_SHA256) += sha256.o
obj-$(CONFIG_CRYPTO_SHA512) += sha512.o
obj-$(CONFIG_CRYPTO_WP512) += wp512.o
obj-$(CONFIG_CRYPTO_TGR192) += tgr192.o
obj-$(CONFIG_CRYPTO_DES) += des.o
obj-$(CONFIG_CRYPTO_BLOWFISH) += blowfish.o
obj-$(CONFIG_CRYPTO_TWOFISH) += twofish.o
obj-$(CONFIG_CRYPTO_TWOFISH_COMMON) += twofish_common.o
obj-$(CONFIG_CRYPTO_SERPENT) += serpent.o
obj-$(CONFIG_CRYPTO_AES) += aes.o
obj-$(CONFIG_CRYPTO_CAST5) += cast5.o
obj-$(CONFIG_CRYPTO_CAST6) += cast6.o
obj-$(CONFIG_CRYPTO_ARC4) += arc4.o
obj-$(CONFIG_CRYPTO_TEA) += tea.o
obj-$(CONFIG_CRYPTO_KHAZAD) += khazad.o
obj-$(CONFIG_CRYPTO_ANUBIS) += anubis.o
obj-$(CONFIG_CRYPTO_DEFLATE) += deflate.o
obj-$(CONFIG_CRYPTO_MICHAEL_MIC) += michael_mic.o
obj-$(CONFIG_CRYPTO_CRC32C) += crc32c.o

obj-$(CONFIG_CRYPTO_TEST) += tcrypt.o
crypto/blkcipher.c (file was created)

/*
 * Block chaining cipher operations.
 *
 * Generic encrypt/decrypt wrapper for ciphers, handles operations across
 * multiple page boundaries by using temporary blocks.  In user context,
 * the kernel is given a chance to schedule us once per page.
 *
 * Copyright (c) 2006 Herbert Xu <herbert@gondor.apana.org.au>
 *
 * This program is free software; you can redistribute it and/or modify it
 * under the terms of the GNU General Public License as published by the Free
 * Software Foundation; either version 2 of the License, or (at your option)
 * any later version.
 *
 */

#include <linux/crypto.h>
#include <linux/errno.h>
#include <linux/kernel.h>
#include <linux/io.h>
#include <linux/module.h>
#include <linux/scatterlist.h>
#include <linux/seq_file.h>
#include <linux/slab.h>
#include <linux/string.h>

#include "internal.h"
#include "scatterwalk.h"

enum {
	BLKCIPHER_WALK_PHYS = 1 << 0,
	BLKCIPHER_WALK_SLOW = 1 << 1,
	BLKCIPHER_WALK_COPY = 1 << 2,
	BLKCIPHER_WALK_DIFF = 1 << 3,
};

static int blkcipher_walk_next(struct blkcipher_desc *desc,
			       struct blkcipher_walk *walk);
static int blkcipher_walk_first(struct blkcipher_desc *desc,
				struct blkcipher_walk *walk);

static inline void blkcipher_map_src(struct blkcipher_walk *walk)
{
	walk->src.virt.addr = scatterwalk_map(&walk->in, 0);
}

static inline void blkcipher_map_dst(struct blkcipher_walk *walk)
{
	walk->dst.virt.addr = scatterwalk_map(&walk->out, 1);
}

static inline void blkcipher_unmap_src(struct blkcipher_walk *walk)
{
	scatterwalk_unmap(walk->src.virt.addr, 0);
}

static inline void blkcipher_unmap_dst(struct blkcipher_walk *walk)
{
	scatterwalk_unmap(walk->dst.virt.addr, 1);
}

static inline u8 *blkcipher_get_spot(u8 *start, unsigned int len)
{
	if (offset_in_page(start + len) < len)
		return (u8 *)((unsigned long)(start + len) & PAGE_MASK);
	return start;
}

static inline unsigned int blkcipher_done_slow(struct crypto_blkcipher *tfm,
					       struct blkcipher_walk *walk,
					       unsigned int bsize)
{
	u8 *addr;
	unsigned int alignmask = crypto_blkcipher_alignmask(tfm);

	addr = (u8 *)ALIGN((unsigned long)walk->buffer, alignmask + 1);
	addr = blkcipher_get_spot(addr, bsize);
	scatterwalk_copychunks(addr, &walk->out, bsize, 1);
	return bsize;
}

static inline unsigned int blkcipher_done_fast(struct blkcipher_walk *walk,
					       unsigned int n)
{
	n = walk->nbytes - n;

	if (walk->flags & BLKCIPHER_WALK_COPY) {
		blkcipher_map_dst(walk);
		memcpy(walk->dst.virt.addr, walk->page, n);
		blkcipher_unmap_dst(walk);
	} else if (!(walk->flags & BLKCIPHER_WALK_PHYS)) {
		blkcipher_unmap_src(walk);
		if (walk->flags & BLKCIPHER_WALK_DIFF)
			blkcipher_unmap_dst(walk);
	}

	scatterwalk_advance(&walk->in, n);
	scatterwalk_advance(&walk->out, n);

	return n;
}

int blkcipher_walk_done(struct blkcipher_desc *desc,
			struct blkcipher_walk *walk, int err)
{
	struct crypto_blkcipher *tfm = desc->tfm;
	unsigned int nbytes = 0;

	if (likely(err >= 0)) {
		unsigned int bsize = crypto_blkcipher_blocksize(tfm);
		unsigned int n;

		if (likely(!(walk->flags & BLKCIPHER_WALK_SLOW)))
			n = blkcipher_done_fast(walk, err);
		else
			n = blkcipher_done_slow(tfm, walk, bsize);

		nbytes = walk->total - n;
		err = 0;
	}

	scatterwalk_done(&walk->in, 0, nbytes);
	scatterwalk_done(&walk->out, 1, nbytes);

	walk->total = nbytes;
	walk->nbytes = nbytes;

	if (nbytes) {
		crypto_yield(desc->flags);
		return blkcipher_walk_next(desc, walk);
	}

	if (walk->iv != desc->info)
		memcpy(desc->info, walk->iv, crypto_blkcipher_ivsize(tfm));
	if (walk->buffer != walk->page)
		kfree(walk->buffer);
	if (walk->page)
		free_page((unsigned long)walk->page);

	return err;
}
EXPORT_SYMBOL_GPL(blkcipher_walk_done);

static inline int blkcipher_next_slow(struct blkcipher_desc *desc,
				      struct blkcipher_walk *walk,
				      unsigned int bsize,
				      unsigned int alignmask)
{
	unsigned int n;

	if (walk->buffer)
		goto ok;

	walk->buffer = walk->page;
	if (walk->buffer)
		goto ok;

	n = bsize * 2 + (alignmask & ~(crypto_tfm_ctx_alignment() - 1));
	walk->buffer = kmalloc(n, GFP_ATOMIC);
	if (!walk->buffer)
		return blkcipher_walk_done(desc, walk, -ENOMEM);

ok:
	walk->dst.virt.addr = (u8 *)ALIGN((unsigned long)walk->buffer,
					  alignmask + 1);
	walk->dst.virt.addr = blkcipher_get_spot(walk->dst.virt.addr, bsize);
	walk->src.virt.addr = blkcipher_get_spot(walk->dst.virt.addr + bsize,
						 bsize);

	scatterwalk_copychunks(walk->src.virt.addr, &walk->in, bsize, 0);

	walk->nbytes = bsize;
	walk->flags |= BLKCIPHER_WALK_SLOW;

	return 0;
}

static inline int blkcipher_next_copy(struct blkcipher_walk *walk)
{
	u8 *tmp = walk->page;

	blkcipher_map_src(walk);
	memcpy(tmp, walk->src.virt.addr, walk->nbytes);
	blkcipher_unmap_src(walk);

	walk->src.virt.addr = tmp;
	walk->dst.virt.addr = tmp;

	return 0;
}

static inline int blkcipher_next_fast(struct blkcipher_desc *desc,
				      struct blkcipher_walk *walk)
{
	unsigned long diff;

	walk->src.phys.page = scatterwalk_page(&walk->in);
	walk->src.phys.offset = offset_in_page(walk->in.offset);
	walk->dst.phys.page = scatterwalk_page(&walk->out);
	walk->dst.phys.offset = offset_in_page(walk->out.offset);

	if (walk->flags & BLKCIPHER_WALK_PHYS)
		return 0;

	diff = walk->src.phys.offset - walk->dst.phys.offset;
	diff |= walk->src.virt.page - walk->dst.virt.page;

	blkcipher_map_src(walk);
	walk->dst.virt.addr = walk->src.virt.addr;

	if (diff) {
		walk->flags |= BLKCIPHER_WALK_DIFF;
		blkcipher_map_dst(walk);
	}

	return 0;
}

static int blkcipher_walk_next(struct blkcipher_desc *desc,
			       struct blkcipher_walk *walk)
{
	struct crypto_blkcipher *tfm = desc->tfm;
	unsigned int alignmask = crypto_blkcipher_alignmask(tfm);
	unsigned int bsize = crypto_blkcipher_blocksize(tfm);
	unsigned int n;
	int err;

	n = walk->total;
	if (unlikely(n < bsize)) {
		desc->flags |= CRYPTO_TFM_RES_BAD_BLOCK_LEN;
		return blkcipher_walk_done(desc, walk, -EINVAL);
	}

	walk->flags &= ~(BLKCIPHER_WALK_SLOW | BLKCIPHER_WALK_COPY |
			 BLKCIPHER_WALK_DIFF);
	if (!scatterwalk_aligned(&walk->in, alignmask) ||
	    !scatterwalk_aligned(&walk->out, alignmask)) {
		walk->flags |= BLKCIPHER_WALK_COPY;
		if (!walk->page) {
			walk->page = (void *)__get_free_page(GFP_ATOMIC);
			if (!walk->page)
				n = 0;
		}
	}

	n = scatterwalk_clamp(&walk->in, n);
	n = scatterwalk_clamp(&walk->out, n);

	if (unlikely(n < bsize)) {
		err = blkcipher_next_slow(desc, walk, bsize, alignmask);
		goto set_phys_lowmem;
	}

	walk->nbytes = n;
	if (walk->flags & BLKCIPHER_WALK_COPY) {
		err = blkcipher_next_copy(walk);
		goto set_phys_lowmem;
	}

	return blkcipher_next_fast(desc, walk);

set_phys_lowmem:
	if (walk->flags & BLKCIPHER_WALK_PHYS) {
		walk->src.phys.page = virt_to_page(walk->src.virt.addr);
		walk->dst.phys.page = virt_to_page(walk->dst.virt.addr);
		walk->src.phys.offset &= PAGE_SIZE - 1;
		walk->dst.phys.offset &= PAGE_SIZE - 1;
	}
	return err;
}

static inline int blkcipher_copy_iv(struct blkcipher_walk *walk,
				    struct crypto_blkcipher *tfm,
				    unsigned int alignmask)
{
	unsigned bs = crypto_blkcipher_blocksize(tfm);
	unsigned int ivsize = crypto_blkcipher_ivsize(tfm);
	unsigned int size = bs * 2 + ivsize + max(bs, ivsize) - (alignmask + 1);
	u8 *iv;

	size += alignmask & ~(crypto_tfm_ctx_alignment() - 1);
	walk->buffer = kmalloc(size, GFP_ATOMIC);
	if (!walk->buffer)
		return -ENOMEM;

	iv = (u8 *)ALIGN((unsigned long)walk->buffer, alignmask + 1);
	iv = blkcipher_get_spot(iv, bs) + bs;
	iv = blkcipher_get_spot(iv, bs) + bs;
	iv = blkcipher_get_spot(iv, ivsize);

	walk->iv = memcpy(iv, walk->iv, ivsize);
	return 0;
}

int blkcipher_walk_virt(struct blkcipher_desc *desc,
			struct blkcipher_walk *walk)
{
	walk->flags &= ~BLKCIPHER_WALK_PHYS;
	return blkcipher_walk_first(desc, walk);
}
EXPORT_SYMBOL_GPL(blkcipher_walk_virt);

int blkcipher_walk_phys(struct blkcipher_desc *desc,
			struct blkcipher_walk *walk)
{
	walk->flags |= BLKCIPHER_WALK_PHYS;
	return blkcipher_walk_first(desc, walk);
}
EXPORT_SYMBOL_GPL(blkcipher_walk_phys);

static int blkcipher_walk_first(struct blkcipher_desc *desc,
				struct blkcipher_walk *walk)
{
	struct crypto_blkcipher *tfm = desc->tfm;
	unsigned int alignmask = crypto_blkcipher_alignmask(tfm);

	walk->nbytes = walk->total;
	if (unlikely(!walk->total))
		return 0;

	walk->buffer = NULL;
	walk->iv = desc->info;
	if (unlikely(((unsigned long)walk->iv & alignmask))) {
		int err = blkcipher_copy_iv(walk, tfm, alignmask);
		if (err)
			return err;
	}

	scatterwalk_start(&walk->in, walk->in.sg);
	scatterwalk_start(&walk->out, walk->out.sg);
	walk->page = NULL;

	return blkcipher_walk_next(desc, walk);
}

static int setkey(struct crypto_tfm *tfm, const u8 *key,
		  unsigned int keylen)
{
	struct blkcipher_alg *cipher = &tfm->__crt_alg->cra_blkcipher;

	if (keylen < cipher->min_keysize || keylen > cipher->max_keysize) {
		tfm->crt_flags |= CRYPTO_TFM_RES_BAD_KEY_LEN;
		return -EINVAL;
	}

	return cipher->setkey(tfm, key, keylen);
}

static unsigned int crypto_blkcipher_ctxsize(struct crypto_alg *alg)
{
	struct blkcipher_alg *cipher = &alg->cra_blkcipher;
	unsigned int len = alg->cra_ctxsize;

	if (cipher->ivsize) {
		len = ALIGN(len, (unsigned long)alg->cra_alignmask + 1);
		len += cipher->ivsize;
	}

	return len;
}

static int crypto_init_blkcipher_ops(struct crypto_tfm *tfm)
{
	struct blkcipher_tfm *crt = &tfm->crt_blkcipher;
	struct blkcipher_alg *alg = &tfm->__crt_alg->cra_blkcipher;
	unsigned long align = crypto_tfm_alg_alignmask(tfm) + 1;
	unsigned long addr;

	if (alg->ivsize > PAGE_SIZE / 8)
		return -EINVAL;

	crt->setkey = setkey;
	crt->encrypt = alg->encrypt;
	crt->decrypt = alg->decrypt;

	addr = (unsigned long)crypto_tfm_ctx(tfm);
	addr = ALIGN(addr, align);
	addr += ALIGN(tfm->__crt_alg->cra_ctxsize, align);
	crt->iv = (void *)addr;

	return 0;
}

static void crypto_blkcipher_show(struct seq_file *m, struct crypto_alg *alg)
	__attribute_used__;
static void crypto_blkcipher_show(struct seq_file *m, struct crypto_alg *alg)
{
	seq_printf(m, "type         : blkcipher\n");
	seq_printf(m, "blocksize    : %u\n", alg->cra_blocksize);
	seq_printf(m, "min keysize  : %u\n", alg->cra_blkcipher.min_keysize);
	seq_printf(m, "max keysize  : %u\n", alg->cra_blkcipher.max_keysize);
	seq_printf(m, "ivsize       : %u\n", alg->cra_blkcipher.ivsize);
}

const struct crypto_type crypto_blkcipher_type = {
	.ctxsize = crypto_blkcipher_ctxsize,
	.init = crypto_init_blkcipher_ops,
#ifdef CONFIG_PROC_FS
	.show = crypto_blkcipher_show,
#endif
};
EXPORT_SYMBOL_GPL(crypto_blkcipher_type);

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Generic block chaining cipher type");
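To show how a driver would expose an algorithm of the new type, here is a
hypothetical registration sketch (not part of this commit; struct example_ctx
and the example_* handlers are assumed to exist elsewhere).  cra_type points
at crypto_blkcipher_type above, and the handlers live in the cra_blkcipher
union member declared in include/linux/crypto.h below:

static struct crypto_alg example_alg = {
	.cra_name		= "cbc(example)",
	.cra_driver_name	= "cbc-example-sketch",
	.cra_flags		= CRYPTO_ALG_TYPE_BLKCIPHER,
	.cra_blocksize		= 16,	/* illustrative block size */
	.cra_ctxsize		= sizeof(struct example_ctx),
	.cra_type		= &crypto_blkcipher_type,
	.cra_module		= THIS_MODULE,
	.cra_list		= LIST_HEAD_INIT(example_alg.cra_list),
	.cra_u			= { .blkcipher = {
		.min_keysize	= 16,
		.max_keysize	= 32,
		.ivsize		= 16,
		.setkey		= example_setkey,
		.encrypt	= example_encrypt,
		.decrypt	= example_decrypt,
	} },
};

/* Registration uses the ordinary interface:
 *	crypto_register_alg(&example_alg);
 *	crypto_unregister_alg(&example_alg);
 */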
include/crypto/algapi.h
/*
 * Cryptographic API for algorithms (i.e., low-level API).
 *
 * Copyright (c) 2006 Herbert Xu <herbert@gondor.apana.org.au>
 *
 * This program is free software; you can redistribute it and/or modify it
 * under the terms of the GNU General Public License as published by the Free
 * Software Foundation; either version 2 of the License, or (at your option)
 * any later version.
 *
 */
#ifndef _CRYPTO_ALGAPI_H
#define _CRYPTO_ALGAPI_H

#include <linux/crypto.h>

struct module;
struct seq_file;

struct crypto_type {
	unsigned int (*ctxsize)(struct crypto_alg *alg);
	int (*init)(struct crypto_tfm *tfm);
	void (*exit)(struct crypto_tfm *tfm);
	void (*show)(struct seq_file *m, struct crypto_alg *alg);
};

struct crypto_instance {
	struct crypto_alg alg;

	struct crypto_template *tmpl;
	struct hlist_node list;

	void *__ctx[] CRYPTO_MINALIGN_ATTR;
};

struct crypto_template {
	struct list_head list;
	struct hlist_head instances;
	struct module *module;

	struct crypto_instance *(*alloc)(void *param, unsigned int len);
	void (*free)(struct crypto_instance *inst);

	char name[CRYPTO_MAX_ALG_NAME];
};

struct crypto_spawn {
	struct list_head list;
	struct crypto_alg *alg;
	struct crypto_instance *inst;
};

struct scatter_walk {
	struct scatterlist *sg;
	unsigned int offset;
};

+struct blkcipher_walk {
+	union {
+		struct {
+			struct page *page;
+			unsigned long offset;
+		} phys;
+
+		struct {
+			u8 *page;
+			u8 *addr;
+		} virt;
+	} src, dst;
+
+	struct scatter_walk in;
+	unsigned int nbytes;
+
+	struct scatter_walk out;
+	unsigned int total;
+
+	void *page;
+	u8 *buffer;
+	u8 *iv;
+
+	int flags;
+};
+
+extern const struct crypto_type crypto_blkcipher_type;
+
int crypto_register_template(struct crypto_template *tmpl);
void crypto_unregister_template(struct crypto_template *tmpl);
struct crypto_template *crypto_lookup_template(const char *name);

int crypto_init_spawn(struct crypto_spawn *spawn, struct crypto_alg *alg,
		      struct crypto_instance *inst);
void crypto_drop_spawn(struct crypto_spawn *spawn);
struct crypto_tfm *crypto_spawn_tfm(struct crypto_spawn *spawn);

struct crypto_alg *crypto_get_attr_alg(void *param, unsigned int len,
				       u32 type, u32 mask);
struct crypto_instance *crypto_alloc_instance(const char *name,
					      struct crypto_alg *alg);

+int blkcipher_walk_done(struct blkcipher_desc *desc,
+			struct blkcipher_walk *walk, int err);
+int blkcipher_walk_virt(struct blkcipher_desc *desc,
+			struct blkcipher_walk *walk);
+int blkcipher_walk_phys(struct blkcipher_desc *desc,
+			struct blkcipher_walk *walk);
+
+static inline void *crypto_tfm_ctx_aligned(struct crypto_tfm *tfm)
+{
+	unsigned long addr = (unsigned long)crypto_tfm_ctx(tfm);
+	unsigned long align = crypto_tfm_alg_alignmask(tfm);
+
+	if (align <= crypto_tfm_ctx_alignment())
+		align = 1;
+	return (void *)ALIGN(addr, align);
+}
+
static inline void *crypto_instance_ctx(struct crypto_instance *inst)
{
	return inst->__ctx;
}

+static inline void *crypto_blkcipher_ctx(struct crypto_blkcipher *tfm)
+{
+	return crypto_tfm_ctx(&tfm->base);
+}
+
+static inline void *crypto_blkcipher_ctx_aligned(struct crypto_blkcipher *tfm)
+{
+	return crypto_tfm_ctx_aligned(&tfm->base);
+}
+
static inline struct cipher_alg *crypto_cipher_alg(struct crypto_cipher *tfm)
{
	return &crypto_cipher_tfm(tfm)->__crt_alg->cra_cipher;
+}
+
+static inline void blkcipher_walk_init(struct blkcipher_walk *walk,
+				       struct scatterlist *dst,
+				       struct scatterlist *src,
+				       unsigned int nbytes)
+{
+	walk->in.sg = src;
+	walk->out.sg = dst;
+	walk->total = nbytes;
}

#endif	/* _CRYPTO_ALGAPI_H */
include/linux/crypto.h
1 /* 1 /*
2 * Scatterlist Cryptographic API. 2 * Scatterlist Cryptographic API.
3 * 3 *
4 * Copyright (c) 2002 James Morris <jmorris@intercode.com.au> 4 * Copyright (c) 2002 James Morris <jmorris@intercode.com.au>
5 * Copyright (c) 2002 David S. Miller (davem@redhat.com) 5 * Copyright (c) 2002 David S. Miller (davem@redhat.com)
6 * Copyright (c) 2005 Herbert Xu <herbert@gondor.apana.org.au> 6 * Copyright (c) 2005 Herbert Xu <herbert@gondor.apana.org.au>
7 * 7 *
8 * Portions derived from Cryptoapi, by Alexander Kjeldaas <astor@fast.no> 8 * Portions derived from Cryptoapi, by Alexander Kjeldaas <astor@fast.no>
9 * and Nettle, by Niels Mรถller. 9 * and Nettle, by Niels Mรถller.
10 * 10 *
11 * This program is free software; you can redistribute it and/or modify it 11 * This program is free software; you can redistribute it and/or modify it
12 * under the terms of the GNU General Public License as published by the Free 12 * under the terms of the GNU General Public License as published by the Free
13 * Software Foundation; either version 2 of the License, or (at your option) 13 * Software Foundation; either version 2 of the License, or (at your option)
14 * any later version. 14 * any later version.
15 * 15 *
16 */ 16 */
17 #ifndef _LINUX_CRYPTO_H 17 #ifndef _LINUX_CRYPTO_H
18 #define _LINUX_CRYPTO_H 18 #define _LINUX_CRYPTO_H
19 19
20 #include <asm/atomic.h> 20 #include <asm/atomic.h>
21 #include <linux/module.h> 21 #include <linux/module.h>
22 #include <linux/kernel.h> 22 #include <linux/kernel.h>
23 #include <linux/types.h> 23 #include <linux/types.h>
24 #include <linux/list.h> 24 #include <linux/list.h>
25 #include <linux/slab.h> 25 #include <linux/slab.h>
26 #include <linux/string.h> 26 #include <linux/string.h>
27 #include <linux/uaccess.h> 27 #include <linux/uaccess.h>
28 28
29 /* 29 /*
30 * Algorithm masks and types. 30 * Algorithm masks and types.
31 */ 31 */
32 #define CRYPTO_ALG_TYPE_MASK 0x0000000f 32 #define CRYPTO_ALG_TYPE_MASK 0x0000000f
33 #define CRYPTO_ALG_TYPE_CIPHER 0x00000001 33 #define CRYPTO_ALG_TYPE_CIPHER 0x00000001
34 #define CRYPTO_ALG_TYPE_DIGEST 0x00000002 34 #define CRYPTO_ALG_TYPE_DIGEST 0x00000002
35 #define CRYPTO_ALG_TYPE_BLKCIPHER 0x00000003
35 #define CRYPTO_ALG_TYPE_COMPRESS 0x00000004 36 #define CRYPTO_ALG_TYPE_COMPRESS 0x00000004
36 37
37 #define CRYPTO_ALG_LARVAL 0x00000010 38 #define CRYPTO_ALG_LARVAL 0x00000010
38 #define CRYPTO_ALG_DEAD 0x00000020 39 #define CRYPTO_ALG_DEAD 0x00000020
39 #define CRYPTO_ALG_DYING 0x00000040 40 #define CRYPTO_ALG_DYING 0x00000040
40 #define CRYPTO_ALG_ASYNC 0x00000080 41 #define CRYPTO_ALG_ASYNC 0x00000080
41 42
42 /* 43 /*
43 * Transform masks and values (for crt_flags). 44 * Transform masks and values (for crt_flags).
44 */ 45 */
45 #define CRYPTO_TFM_MODE_MASK 0x000000ff 46 #define CRYPTO_TFM_MODE_MASK 0x000000ff
46 #define CRYPTO_TFM_REQ_MASK 0x000fff00 47 #define CRYPTO_TFM_REQ_MASK 0x000fff00
47 #define CRYPTO_TFM_RES_MASK 0xfff00000 48 #define CRYPTO_TFM_RES_MASK 0xfff00000
48 49
49 #define CRYPTO_TFM_MODE_ECB 0x00000001 50 #define CRYPTO_TFM_MODE_ECB 0x00000001
50 #define CRYPTO_TFM_MODE_CBC 0x00000002 51 #define CRYPTO_TFM_MODE_CBC 0x00000002
51 #define CRYPTO_TFM_MODE_CFB 0x00000004 52 #define CRYPTO_TFM_MODE_CFB 0x00000004
52 #define CRYPTO_TFM_MODE_CTR 0x00000008 53 #define CRYPTO_TFM_MODE_CTR 0x00000008
53 54
54 #define CRYPTO_TFM_REQ_WEAK_KEY 0x00000100 55 #define CRYPTO_TFM_REQ_WEAK_KEY 0x00000100
55 #define CRYPTO_TFM_REQ_MAY_SLEEP 0x00000200 56 #define CRYPTO_TFM_REQ_MAY_SLEEP 0x00000200
56 #define CRYPTO_TFM_RES_WEAK_KEY 0x00100000 57 #define CRYPTO_TFM_RES_WEAK_KEY 0x00100000
57 #define CRYPTO_TFM_RES_BAD_KEY_LEN 0x00200000 58 #define CRYPTO_TFM_RES_BAD_KEY_LEN 0x00200000
58 #define CRYPTO_TFM_RES_BAD_KEY_SCHED 0x00400000 59 #define CRYPTO_TFM_RES_BAD_KEY_SCHED 0x00400000
59 #define CRYPTO_TFM_RES_BAD_BLOCK_LEN 0x00800000 60 #define CRYPTO_TFM_RES_BAD_BLOCK_LEN 0x00800000
60 #define CRYPTO_TFM_RES_BAD_FLAGS 0x01000000 61 #define CRYPTO_TFM_RES_BAD_FLAGS 0x01000000
61 62
62 /* 63 /*
63 * Miscellaneous stuff. 64 * Miscellaneous stuff.
64 */ 65 */
65 #define CRYPTO_UNSPEC 0 66 #define CRYPTO_UNSPEC 0
66 #define CRYPTO_MAX_ALG_NAME 64 67 #define CRYPTO_MAX_ALG_NAME 64
67 68
68 #define CRYPTO_DIR_ENCRYPT 1 69 #define CRYPTO_DIR_ENCRYPT 1
69 #define CRYPTO_DIR_DECRYPT 0 70 #define CRYPTO_DIR_DECRYPT 0
70 71
71 /* 72 /*
72 * The macro CRYPTO_MINALIGN_ATTR (along with the void * type in the actual 73 * The macro CRYPTO_MINALIGN_ATTR (along with the void * type in the actual
73 * declaration) is used to ensure that the crypto_tfm context structure is 74 * declaration) is used to ensure that the crypto_tfm context structure is
74 * aligned correctly for the given architecture so that there are no alignment 75 * aligned correctly for the given architecture so that there are no alignment
75 * faults for C data types. In particular, this is required on platforms such 76 * faults for C data types. In particular, this is required on platforms such
76 * as arm where pointers are 32-bit aligned but there are data types such as 77 * as arm where pointers are 32-bit aligned but there are data types such as
77 * u64 which require 64-bit alignment. 78 * u64 which require 64-bit alignment.
78 */ 79 */
79 #if defined(ARCH_KMALLOC_MINALIGN) 80 #if defined(ARCH_KMALLOC_MINALIGN)
80 #define CRYPTO_MINALIGN ARCH_KMALLOC_MINALIGN 81 #define CRYPTO_MINALIGN ARCH_KMALLOC_MINALIGN
#elif defined(ARCH_SLAB_MINALIGN)
#define CRYPTO_MINALIGN ARCH_SLAB_MINALIGN
#endif

#ifdef CRYPTO_MINALIGN
#define CRYPTO_MINALIGN_ATTR __attribute__ ((__aligned__(CRYPTO_MINALIGN)))
#else
#define CRYPTO_MINALIGN_ATTR
#endif

struct scatterlist;
struct crypto_blkcipher;
struct crypto_tfm;
struct crypto_type;

struct blkcipher_desc {
	struct crypto_blkcipher *tfm;
	void *info;
	u32 flags;
};

struct cipher_desc {
	struct crypto_tfm *tfm;
	void (*crfn)(struct crypto_tfm *tfm, u8 *dst, const u8 *src);
	unsigned int (*prfn)(const struct cipher_desc *desc, u8 *dst,
			     const u8 *src, unsigned int nbytes);
	void *info;
};

/*
 * Algorithms: modular crypto algorithm implementations, managed
 * via crypto_register_alg() and crypto_unregister_alg().
 */
struct blkcipher_alg {
	int (*setkey)(struct crypto_tfm *tfm, const u8 *key,
		      unsigned int keylen);
	int (*encrypt)(struct blkcipher_desc *desc,
		       struct scatterlist *dst, struct scatterlist *src,
		       unsigned int nbytes);
	int (*decrypt)(struct blkcipher_desc *desc,
		       struct scatterlist *dst, struct scatterlist *src,
		       unsigned int nbytes);

	unsigned int min_keysize;
	unsigned int max_keysize;
	unsigned int ivsize;
};

struct cipher_alg {
	unsigned int cia_min_keysize;
	unsigned int cia_max_keysize;
	int (*cia_setkey)(struct crypto_tfm *tfm, const u8 *key,
			  unsigned int keylen);
	void (*cia_encrypt)(struct crypto_tfm *tfm, u8 *dst, const u8 *src);
	void (*cia_decrypt)(struct crypto_tfm *tfm, u8 *dst, const u8 *src);

	unsigned int (*cia_encrypt_ecb)(const struct cipher_desc *desc,
					u8 *dst, const u8 *src,
					unsigned int nbytes);
	unsigned int (*cia_decrypt_ecb)(const struct cipher_desc *desc,
					u8 *dst, const u8 *src,
					unsigned int nbytes);
	unsigned int (*cia_encrypt_cbc)(const struct cipher_desc *desc,
					u8 *dst, const u8 *src,
					unsigned int nbytes);
	unsigned int (*cia_decrypt_cbc)(const struct cipher_desc *desc,
					u8 *dst, const u8 *src,
					unsigned int nbytes);
};

struct digest_alg {
	unsigned int dia_digestsize;
	void (*dia_init)(struct crypto_tfm *tfm);
	void (*dia_update)(struct crypto_tfm *tfm, const u8 *data,
			   unsigned int len);
	void (*dia_final)(struct crypto_tfm *tfm, u8 *out);
	int (*dia_setkey)(struct crypto_tfm *tfm, const u8 *key,
			  unsigned int keylen);
};

struct compress_alg {
	int (*coa_compress)(struct crypto_tfm *tfm, const u8 *src,
			    unsigned int slen, u8 *dst, unsigned int *dlen);
	int (*coa_decompress)(struct crypto_tfm *tfm, const u8 *src,
			      unsigned int slen, u8 *dst, unsigned int *dlen);
};

#define cra_blkcipher	cra_u.blkcipher
#define cra_cipher	cra_u.cipher
#define cra_digest	cra_u.digest
#define cra_compress	cra_u.compress

struct crypto_alg {
	struct list_head cra_list;
	struct list_head cra_users;

	u32 cra_flags;
	unsigned int cra_blocksize;
	unsigned int cra_ctxsize;
	unsigned int cra_alignmask;

	int cra_priority;
	atomic_t cra_refcnt;

	char cra_name[CRYPTO_MAX_ALG_NAME];
	char cra_driver_name[CRYPTO_MAX_ALG_NAME];

	const struct crypto_type *cra_type;

	union {
		struct blkcipher_alg blkcipher;
		struct cipher_alg cipher;
		struct digest_alg digest;
		struct compress_alg compress;
	} cra_u;

	int (*cra_init)(struct crypto_tfm *tfm);
	void (*cra_exit)(struct crypto_tfm *tfm);
	void (*cra_destroy)(struct crypto_alg *alg);

	struct module *cra_module;
};
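
To make the new cra_blkcipher union member concrete, here is a minimal registration sketch under stated assumptions: struct my_ctx, my_setkey, my_encrypt and my_decrypt are hypothetical driver-side names, and crypto_blkcipher_type is assumed to be the crypto_type object the block cipher mid-layer exports for use as cra_type. The registration calls themselves are declared immediately below.

/* Hypothetical driver-side sketch, not part of this patch. */
static struct crypto_alg my_alg = {
	.cra_name	= "cbc(myciph)",		/* placeholder name */
	.cra_flags	= CRYPTO_ALG_TYPE_BLKCIPHER,
	.cra_blocksize	= 16,
	.cra_ctxsize	= sizeof(struct my_ctx),	/* per-tfm context */
	.cra_type	= &crypto_blkcipher_type,	/* assumed mid-layer export */
	.cra_module	= THIS_MODULE,
	.cra_blkcipher	= {
		.min_keysize	= 16,
		.max_keysize	= 32,
		.ivsize		= 16,
		.setkey		= my_setkey,	/* sketched near crypto_tfm_ctx() below */
		.encrypt	= my_encrypt,	/* walks the sg lists itself */
		.decrypt	= my_decrypt,
	},
};

static int __init my_init(void)
{
	return crypto_register_alg(&my_alg);
}

static void __exit my_exit(void)
{
	crypto_unregister_alg(&my_alg);
}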

/*
 * Algorithm registration interface.
 */
int crypto_register_alg(struct crypto_alg *alg);
int crypto_unregister_alg(struct crypto_alg *alg);

/*
 * Algorithm query interface.
 */
#ifdef CONFIG_CRYPTO
int crypto_alg_available(const char *name, u32 flags);
#else
static inline int crypto_alg_available(const char *name, u32 flags)
{
	return 0;
}
#endif

/*
 * Transforms: user-instantiated objects which encapsulate algorithms
 * and core processing logic.  Managed via crypto_alloc_*() and
 * crypto_free_*(), as well as the various helpers below.
 */

struct blkcipher_tfm {
	void *iv;
	int (*setkey)(struct crypto_tfm *tfm, const u8 *key,
		      unsigned int keylen);
	int (*encrypt)(struct blkcipher_desc *desc, struct scatterlist *dst,
		       struct scatterlist *src, unsigned int nbytes);
	int (*decrypt)(struct blkcipher_desc *desc, struct scatterlist *dst,
		       struct scatterlist *src, unsigned int nbytes);
};

struct cipher_tfm {
	void *cit_iv;
	unsigned int cit_ivsize;
	u32 cit_mode;
	int (*cit_setkey)(struct crypto_tfm *tfm,
			  const u8 *key, unsigned int keylen);
	int (*cit_encrypt)(struct crypto_tfm *tfm,
			   struct scatterlist *dst,
			   struct scatterlist *src,
			   unsigned int nbytes);
	int (*cit_encrypt_iv)(struct crypto_tfm *tfm,
			      struct scatterlist *dst,
			      struct scatterlist *src,
			      unsigned int nbytes, u8 *iv);
	int (*cit_decrypt)(struct crypto_tfm *tfm,
			   struct scatterlist *dst,
			   struct scatterlist *src,
			   unsigned int nbytes);
	int (*cit_decrypt_iv)(struct crypto_tfm *tfm,
			      struct scatterlist *dst,
			      struct scatterlist *src,
			      unsigned int nbytes, u8 *iv);
	void (*cit_xor_block)(u8 *dst, const u8 *src);
	void (*cit_encrypt_one)(struct crypto_tfm *tfm, u8 *dst, const u8 *src);
	void (*cit_decrypt_one)(struct crypto_tfm *tfm, u8 *dst, const u8 *src);
};

struct digest_tfm {
	void (*dit_init)(struct crypto_tfm *tfm);
	void (*dit_update)(struct crypto_tfm *tfm,
			   struct scatterlist *sg, unsigned int nsg);
	void (*dit_final)(struct crypto_tfm *tfm, u8 *out);
	void (*dit_digest)(struct crypto_tfm *tfm, struct scatterlist *sg,
			   unsigned int nsg, u8 *out);
	int (*dit_setkey)(struct crypto_tfm *tfm,
			  const u8 *key, unsigned int keylen);
#ifdef CONFIG_CRYPTO_HMAC
	void *dit_hmac_block;
#endif
};

struct compress_tfm {
	int (*cot_compress)(struct crypto_tfm *tfm,
			    const u8 *src, unsigned int slen,
			    u8 *dst, unsigned int *dlen);
	int (*cot_decompress)(struct crypto_tfm *tfm,
			      const u8 *src, unsigned int slen,
			      u8 *dst, unsigned int *dlen);
};

#define crt_blkcipher	crt_u.blkcipher
#define crt_cipher	crt_u.cipher
#define crt_digest	crt_u.digest
#define crt_compress	crt_u.compress

struct crypto_tfm {

	u32 crt_flags;

	union {
		struct blkcipher_tfm blkcipher;
		struct cipher_tfm cipher;
		struct digest_tfm digest;
		struct compress_tfm compress;
	} crt_u;

	struct crypto_alg *__crt_alg;

	void *__crt_ctx[] CRYPTO_MINALIGN_ATTR;
};

#define crypto_cipher crypto_tfm

struct crypto_blkcipher {
	struct crypto_tfm base;
};

enum {
	CRYPTOA_UNSPEC,
	CRYPTOA_ALG,
};

struct crypto_attr_alg {
	char name[CRYPTO_MAX_ALG_NAME];
};

/*
 * Transform user interface.
 */

struct crypto_tfm *crypto_alloc_tfm(const char *alg_name, u32 tfm_flags);
struct crypto_tfm *crypto_alloc_base(const char *alg_name, u32 type, u32 mask);
void crypto_free_tfm(struct crypto_tfm *tfm);

/*
 * Transform helpers which query the underlying algorithm.
 */
static inline const char *crypto_tfm_alg_name(struct crypto_tfm *tfm)
{
	return tfm->__crt_alg->cra_name;
}

static inline const char *crypto_tfm_alg_driver_name(struct crypto_tfm *tfm)
{
	return tfm->__crt_alg->cra_driver_name;
}

static inline int crypto_tfm_alg_priority(struct crypto_tfm *tfm)
{
	return tfm->__crt_alg->cra_priority;
}

static inline const char *crypto_tfm_alg_modname(struct crypto_tfm *tfm)
{
	return module_name(tfm->__crt_alg->cra_module);
}

static inline u32 crypto_tfm_alg_type(struct crypto_tfm *tfm)
{
	return tfm->__crt_alg->cra_flags & CRYPTO_ALG_TYPE_MASK;
}

static inline unsigned int crypto_tfm_alg_min_keysize(struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->__crt_alg->cra_cipher.cia_min_keysize;
}

static inline unsigned int crypto_tfm_alg_max_keysize(struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->__crt_alg->cra_cipher.cia_max_keysize;
}

static inline unsigned int crypto_tfm_alg_ivsize(struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->crt_cipher.cit_ivsize;
}

static inline unsigned int crypto_tfm_alg_blocksize(struct crypto_tfm *tfm)
{
	return tfm->__crt_alg->cra_blocksize;
}

static inline unsigned int crypto_tfm_alg_digestsize(struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_DIGEST);
	return tfm->__crt_alg->cra_digest.dia_digestsize;
}

static inline unsigned int crypto_tfm_alg_alignmask(struct crypto_tfm *tfm)
{
	return tfm->__crt_alg->cra_alignmask;
}

static inline u32 crypto_tfm_get_flags(struct crypto_tfm *tfm)
{
	return tfm->crt_flags;
}

static inline void crypto_tfm_set_flags(struct crypto_tfm *tfm, u32 flags)
{
	tfm->crt_flags |= flags;
}

static inline void crypto_tfm_clear_flags(struct crypto_tfm *tfm, u32 flags)
{
	tfm->crt_flags &= ~flags;
}

static inline void *crypto_tfm_ctx(struct crypto_tfm *tfm)
{
	return tfm->__crt_ctx;
}

static inline unsigned int crypto_tfm_ctx_alignment(void)
{
	struct crypto_tfm *tfm;
	return __alignof__(tfm->__crt_ctx);
}
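
As a hedged illustration of crypto_tfm_ctx(): an algorithm's setkey would typically stash key material in the per-transform context whose size it declared in cra_ctxsize. struct my_ctx and my_setkey are the hypothetical placeholders from the registration sketch above.

struct my_ctx {
	u8 key[32];
	unsigned int keylen;
};

static int my_setkey(struct crypto_tfm *tfm, const u8 *key,
		     unsigned int keylen)
{
	struct my_ctx *ctx = crypto_tfm_ctx(tfm);	/* cra_ctxsize bytes */

	if (keylen < 16 || keylen > 32) {
		crypto_tfm_set_flags(tfm, CRYPTO_TFM_RES_BAD_KEY_LEN);
		return -EINVAL;
	}

	memcpy(ctx->key, key, keylen);
	ctx->keylen = keylen;
	return 0;
}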

/*
 * API wrappers.
 */
static inline struct crypto_blkcipher *__crypto_blkcipher_cast(
	struct crypto_tfm *tfm)
{
	return (struct crypto_blkcipher *)tfm;
}

static inline struct crypto_blkcipher *crypto_blkcipher_cast(
	struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_BLKCIPHER);
	return __crypto_blkcipher_cast(tfm);
}

static inline struct crypto_blkcipher *crypto_alloc_blkcipher(
	const char *alg_name, u32 type, u32 mask)
{
	type &= ~CRYPTO_ALG_TYPE_MASK;
	type |= CRYPTO_ALG_TYPE_BLKCIPHER;
	mask |= CRYPTO_ALG_TYPE_MASK;

	return __crypto_blkcipher_cast(crypto_alloc_base(alg_name, type, mask));
}
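
A hedged allocation sketch: this assumes crypto_alloc_base() reports failure through the usual ERR_PTR() convention, and uses "cbc(aes)" purely as an example algorithm name; crypto_free_blkcipher(), defined just below, releases the transform.

	struct crypto_blkcipher *tfm;

	tfm = crypto_alloc_blkcipher("cbc(aes)", 0, 0);
	if (IS_ERR(tfm))
		return PTR_ERR(tfm);

	/* ... use the transform ... */

	crypto_free_blkcipher(tfm);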

static inline struct crypto_tfm *crypto_blkcipher_tfm(
	struct crypto_blkcipher *tfm)
{
	return &tfm->base;
}

static inline void crypto_free_blkcipher(struct crypto_blkcipher *tfm)
{
	crypto_free_tfm(crypto_blkcipher_tfm(tfm));
}

static inline const char *crypto_blkcipher_name(struct crypto_blkcipher *tfm)
{
	return crypto_tfm_alg_name(crypto_blkcipher_tfm(tfm));
}

static inline struct blkcipher_tfm *crypto_blkcipher_crt(
	struct crypto_blkcipher *tfm)
{
	return &crypto_blkcipher_tfm(tfm)->crt_blkcipher;
}

static inline struct blkcipher_alg *crypto_blkcipher_alg(
	struct crypto_blkcipher *tfm)
{
	return &crypto_blkcipher_tfm(tfm)->__crt_alg->cra_blkcipher;
}

static inline unsigned int crypto_blkcipher_ivsize(struct crypto_blkcipher *tfm)
{
	return crypto_blkcipher_alg(tfm)->ivsize;
}

static inline unsigned int crypto_blkcipher_blocksize(
	struct crypto_blkcipher *tfm)
{
	return crypto_tfm_alg_blocksize(crypto_blkcipher_tfm(tfm));
}

static inline unsigned int crypto_blkcipher_alignmask(
	struct crypto_blkcipher *tfm)
{
	return crypto_tfm_alg_alignmask(crypto_blkcipher_tfm(tfm));
}

static inline u32 crypto_blkcipher_get_flags(struct crypto_blkcipher *tfm)
{
	return crypto_tfm_get_flags(crypto_blkcipher_tfm(tfm));
}

static inline void crypto_blkcipher_set_flags(struct crypto_blkcipher *tfm,
					      u32 flags)
{
	crypto_tfm_set_flags(crypto_blkcipher_tfm(tfm), flags);
}

static inline void crypto_blkcipher_clear_flags(struct crypto_blkcipher *tfm,
						u32 flags)
{
	crypto_tfm_clear_flags(crypto_blkcipher_tfm(tfm), flags);
}

static inline int crypto_blkcipher_setkey(struct crypto_blkcipher *tfm,
					  const u8 *key, unsigned int keylen)
{
	return crypto_blkcipher_crt(tfm)->setkey(crypto_blkcipher_tfm(tfm),
						 key, keylen);
}

static inline int crypto_blkcipher_encrypt(struct blkcipher_desc *desc,
					   struct scatterlist *dst,
					   struct scatterlist *src,
					   unsigned int nbytes)
{
	desc->info = crypto_blkcipher_crt(desc->tfm)->iv;
	return crypto_blkcipher_crt(desc->tfm)->encrypt(desc, dst, src, nbytes);
}

static inline int crypto_blkcipher_encrypt_iv(struct blkcipher_desc *desc,
					      struct scatterlist *dst,
					      struct scatterlist *src,
					      unsigned int nbytes)
{
	return crypto_blkcipher_crt(desc->tfm)->encrypt(desc, dst, src, nbytes);
}

static inline int crypto_blkcipher_decrypt(struct blkcipher_desc *desc,
					   struct scatterlist *dst,
					   struct scatterlist *src,
					   unsigned int nbytes)
{
	desc->info = crypto_blkcipher_crt(desc->tfm)->iv;
	return crypto_blkcipher_crt(desc->tfm)->decrypt(desc, dst, src, nbytes);
}

static inline int crypto_blkcipher_decrypt_iv(struct blkcipher_desc *desc,
					      struct scatterlist *dst,
					      struct scatterlist *src,
					      unsigned int nbytes)
{
	return crypto_blkcipher_crt(desc->tfm)->decrypt(desc, dst, src, nbytes);
}

static inline void crypto_blkcipher_set_iv(struct crypto_blkcipher *tfm,
					   const u8 *src, unsigned int len)
{
	memcpy(crypto_blkcipher_crt(tfm)->iv, src, len);
}

static inline void crypto_blkcipher_get_iv(struct crypto_blkcipher *tfm,
					   u8 *dst, unsigned int len)
{
	memcpy(dst, crypto_blkcipher_crt(tfm)->iv, len);
}
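
Putting the wrappers together, a hedged end-to-end sketch: key the transform, load the IV, describe the buffer with a scatterlist and encrypt it in place. buf, buflen, key and keylen are assumed to exist, buflen is assumed to be a multiple of the block size, and sg_init_one() is assumed available from linux/scatterlist.h.

	struct blkcipher_desc desc = { .tfm = tfm, .flags = 0 };
	struct scatterlist sg;
	u8 iv[16] = { 0 };	/* crypto_blkcipher_ivsize(tfm) bytes */
	int err;

	err = crypto_blkcipher_setkey(tfm, key, keylen);
	if (err)
		return err;

	crypto_blkcipher_set_iv(tfm, iv, crypto_blkcipher_ivsize(tfm));

	sg_init_one(&sg, buf, buflen);
	err = crypto_blkcipher_encrypt(&desc, &sg, &sg, buflen);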

static inline struct crypto_cipher *__crypto_cipher_cast(struct crypto_tfm *tfm)
{
	return (struct crypto_cipher *)tfm;
}

static inline struct crypto_cipher *crypto_cipher_cast(struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return __crypto_cipher_cast(tfm);
}

static inline struct crypto_cipher *crypto_alloc_cipher(const char *alg_name,
							u32 type, u32 mask)
{
	type &= ~CRYPTO_ALG_TYPE_MASK;
	type |= CRYPTO_ALG_TYPE_CIPHER;
	mask |= CRYPTO_ALG_TYPE_MASK;

	return __crypto_cipher_cast(crypto_alloc_base(alg_name, type, mask));
}

static inline struct crypto_tfm *crypto_cipher_tfm(struct crypto_cipher *tfm)
{
	return tfm;
}

static inline void crypto_free_cipher(struct crypto_cipher *tfm)
{
	crypto_free_tfm(crypto_cipher_tfm(tfm));
}

static inline struct cipher_tfm *crypto_cipher_crt(struct crypto_cipher *tfm)
{
	return &crypto_cipher_tfm(tfm)->crt_cipher;
}

static inline unsigned int crypto_cipher_blocksize(struct crypto_cipher *tfm)
{
	return crypto_tfm_alg_blocksize(crypto_cipher_tfm(tfm));
}

static inline unsigned int crypto_cipher_alignmask(struct crypto_cipher *tfm)
{
	return crypto_tfm_alg_alignmask(crypto_cipher_tfm(tfm));
}

static inline u32 crypto_cipher_get_flags(struct crypto_cipher *tfm)
{
	return crypto_tfm_get_flags(crypto_cipher_tfm(tfm));
}

static inline void crypto_cipher_set_flags(struct crypto_cipher *tfm,
					   u32 flags)
{
	crypto_tfm_set_flags(crypto_cipher_tfm(tfm), flags);
}

static inline void crypto_cipher_clear_flags(struct crypto_cipher *tfm,
					     u32 flags)
{
	crypto_tfm_clear_flags(crypto_cipher_tfm(tfm), flags);
}

static inline void crypto_cipher_encrypt_one(struct crypto_cipher *tfm,
					     u8 *dst, const u8 *src)
{
	crypto_cipher_crt(tfm)->cit_encrypt_one(crypto_cipher_tfm(tfm),
						dst, src);
}

static inline void crypto_cipher_decrypt_one(struct crypto_cipher *tfm,
					     u8 *dst, const u8 *src)
{
	crypto_cipher_crt(tfm)->cit_decrypt_one(crypto_cipher_tfm(tfm),
						dst, src);
}
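
For contrast with the sg-based interface above, a hedged single-block sketch: the crypto_cipher helpers transform exactly one block with no chaining or IV, assuming the key has already been set with crypto_cipher_setkey() (declared further below).

	u8 in[16], out[16];	/* each crypto_cipher_blocksize(tfm) bytes */

	crypto_cipher_encrypt_one(tfm, out, in);
	crypto_cipher_decrypt_one(tfm, in, out);	/* recovers the plaintext */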

static inline void crypto_digest_init(struct crypto_tfm *tfm)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_DIGEST);
	tfm->crt_digest.dit_init(tfm);
}

static inline void crypto_digest_update(struct crypto_tfm *tfm,
					struct scatterlist *sg,
					unsigned int nsg)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_DIGEST);
	tfm->crt_digest.dit_update(tfm, sg, nsg);
}

static inline void crypto_digest_final(struct crypto_tfm *tfm, u8 *out)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_DIGEST);
	tfm->crt_digest.dit_final(tfm, out);
}

static inline void crypto_digest_digest(struct crypto_tfm *tfm,
					struct scatterlist *sg,
					unsigned int nsg, u8 *out)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_DIGEST);
	tfm->crt_digest.dit_digest(tfm, sg, nsg, out);
}

static inline int crypto_digest_setkey(struct crypto_tfm *tfm,
				       const u8 *key, unsigned int keylen)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_DIGEST);
	return tfm->crt_digest.dit_setkey(tfm, key, keylen);
}
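
A hedged hashing sketch over the digest entry points, which take scatterlists directly; data and len are assumed buffers, sg_init_one() is assumed available, and out must hold crypto_tfm_alg_digestsize(tfm) bytes (20 here, as for SHA-1).

	struct scatterlist sg;
	u8 out[20];	/* crypto_tfm_alg_digestsize(tfm) */

	sg_init_one(&sg, data, len);

	crypto_digest_init(tfm);
	crypto_digest_update(tfm, &sg, 1);
	crypto_digest_final(tfm, out);

	/* Or, equivalently, in a single call: */
	crypto_digest_digest(tfm, &sg, 1, out);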

static inline int crypto_cipher_setkey(struct crypto_tfm *tfm,
				       const u8 *key, unsigned int keylen)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->crt_cipher.cit_setkey(tfm, key, keylen);
}

static inline int crypto_cipher_encrypt(struct crypto_tfm *tfm,
					struct scatterlist *dst,
					struct scatterlist *src,
					unsigned int nbytes)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->crt_cipher.cit_encrypt(tfm, dst, src, nbytes);
}

static inline int crypto_cipher_encrypt_iv(struct crypto_tfm *tfm,
					   struct scatterlist *dst,
					   struct scatterlist *src,
					   unsigned int nbytes, u8 *iv)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->crt_cipher.cit_encrypt_iv(tfm, dst, src, nbytes, iv);
}

static inline int crypto_cipher_decrypt(struct crypto_tfm *tfm,
					struct scatterlist *dst,
					struct scatterlist *src,
					unsigned int nbytes)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->crt_cipher.cit_decrypt(tfm, dst, src, nbytes);
}

static inline int crypto_cipher_decrypt_iv(struct crypto_tfm *tfm,
					   struct scatterlist *dst,
					   struct scatterlist *src,
					   unsigned int nbytes, u8 *iv)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	return tfm->crt_cipher.cit_decrypt_iv(tfm, dst, src, nbytes, iv);
}

static inline void crypto_cipher_set_iv(struct crypto_tfm *tfm,
					const u8 *src, unsigned int len)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	memcpy(tfm->crt_cipher.cit_iv, src, len);
}

static inline void crypto_cipher_get_iv(struct crypto_tfm *tfm,
					u8 *dst, unsigned int len)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_CIPHER);
	memcpy(dst, tfm->crt_cipher.cit_iv, len);
}

static inline int crypto_comp_compress(struct crypto_tfm *tfm,
				       const u8 *src, unsigned int slen,
				       u8 *dst, unsigned int *dlen)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_COMPRESS);
	return tfm->crt_compress.cot_compress(tfm, src, slen, dst, dlen);
}

static inline int crypto_comp_decompress(struct crypto_tfm *tfm,
					 const u8 *src, unsigned int slen,
					 u8 *dst, unsigned int *dlen)
{
	BUG_ON(crypto_tfm_alg_type(tfm) != CRYPTO_ALG_TYPE_COMPRESS);
	return tfm->crt_compress.cot_decompress(tfm, src, slen, dst, dlen);
}
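
A hedged compression round trip: dlen and olen are in/out parameters, set to the destination capacity on entry and overwritten with the number of bytes actually produced; src, slen, dst, dst_len, orig and orig_len are assumed buffers and sizes.

	unsigned int dlen = dst_len;	/* in: room in dst; out: bytes written */
	unsigned int olen = orig_len;
	int err;

	err = crypto_comp_compress(tfm, src, slen, dst, &dlen);
	if (err)
		return err;

	err = crypto_comp_decompress(tfm, dst, dlen, orig, &olen);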

/*
 * HMAC support.
 */
#ifdef CONFIG_CRYPTO_HMAC
void crypto_hmac_init(struct crypto_tfm *tfm, u8 *key, unsigned int *keylen);
void crypto_hmac_update(struct crypto_tfm *tfm,
			struct scatterlist *sg, unsigned int nsg);
void crypto_hmac_final(struct crypto_tfm *tfm, u8 *key,
		       unsigned int *keylen, u8 *out);
void crypto_hmac(struct crypto_tfm *tfm, u8 *key, unsigned int *keylen,
		 struct scatterlist *sg, unsigned int nsg, u8 *out);
#endif /* CONFIG_CRYPTO_HMAC */
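
Finally, a hedged HMAC sketch over a digest transform; key, keylen, data and len are assumed. keylen is passed by pointer because the implementation may replace an over-long key with its digest and shrink the length accordingly, and mac must hold crypto_tfm_alg_digestsize(tfm) bytes.

	struct scatterlist sg;
	u8 mac[20];	/* crypto_tfm_alg_digestsize(tfm) */

	sg_init_one(&sg, data, len);
	crypto_hmac(tfm, key, &keylen, &sg, 1, mac);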

#endif /* _LINUX_CRYPTO_H */