19 Jun, 2019

1 commit

  • Based on 1 normalized pattern(s):

    this program is free software you can redistribute it and or modify
    it under the terms of the gnu general public license version 2 as
    published by the free software foundation this program is
    distributed in the hope that it will be useful but without any
    warranty without even the implied warranty of merchantability or
    fitness for a particular purpose see the gnu general public license
    for more details you should have received a copy of the gnu general
    public license along with this program if not see http www gnu org
    licenses

    extracted by the scancode license scanner the SPDX license identifier

    GPL-2.0-only

    has been chosen to replace the boilerplate/reference in 503 file(s).

    Signed-off-by: Thomas Gleixner
    Reviewed-by: Alexios Zavras
    Reviewed-by: Allison Randal
    Reviewed-by: Enrico Weigelt
    Cc: linux-spdx@vger.kernel.org
    Link: https://lkml.kernel.org/r/20190602204653.811534538@linutronix.de
    Signed-off-by: Greg Kroah-Hartman

    Thomas Gleixner
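
    For reference, the replacement in each of the 503 files is a single
    SPDX tag on the first line of the source file; per the kernel's
    license-rules convention, C files carry it in a // comment:

      // SPDX-License-Identifier: GPL-2.0-only

    The one-line tag conveys the same licensing terms as the
    multi-paragraph boilerplate quoted above.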

18 Apr, 2019

1 commit

  • Use subsys_initcall for registration of all templates and generic
    algorithm implementations, rather than module_init. Then change
    cryptomgr to use arch_initcall, to place it before the subsys_initcalls.

    This is needed so that when both a generic and optimized implementation
    of an algorithm are built into the kernel (not loadable modules), the
    generic implementation is registered before the optimized one.
    Otherwise, the self-tests for the optimized implementation are unable to
    allocate the generic implementation for the new comparison fuzz tests.

    Note that on arm, a side effect of this change is that self-tests for
    generic implementations may run before the unaligned access handler has
    been installed. So, unaligned accesses will crash the kernel. This is
    arguably a good thing as it makes it easier to detect that type of bug.

    Signed-off-by: Eric Biggers
    Signed-off-by: Herbert Xu

    Eric Biggers
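
    A minimal sketch of the registration change, using sm3_generic.c's
    init function as the example (illustrative; the patch applies the
    same substitution across all generic algorithms and templates):

      #include <linux/module.h>
      #include <crypto/internal/hash.h>

      /* sm3_alg: the struct shash_alg defined earlier in the file. */
      static int __init sm3_generic_mod_init(void)
      {
              return crypto_register_shash(&sm3_alg);
      }

      /* Was: module_init(sm3_generic_mod_init);
       * subsys_initcall runs after arch_initcall (where cryptomgr now
       * registers) and before device_initcall, which is what module_init
       * becomes for built-in code, so the generic implementation is
       * registered before any optimized one self-tests against it. */
      subsys_initcall(sm3_generic_mod_init);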

10 Jan, 2019

1 commit

  • sm3_compress() calls rol32() with shift >= 32, which causes undefined
    behavior. This is easily detected by enabling CONFIG_UBSAN.

    Explicitly AND with 31 to make the behavior well defined.

    Fixes: 4f0fc1600edb ("crypto: sm3 - add OSCCA SM3 secure hash")
    Cc: stable@vger.kernel.org # v4.15+
    Cc: Gilad Ben-Yossef
    Signed-off-by: Eric Biggers
    Signed-off-by: Herbert Xu

    Eric Biggers
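
    A sketch of the fix, assuming the round constant t is rotated by the
    round index i inside the 64-round compression loop (sm3_ss1 is a
    hypothetical helper; the expression in sm3_compress() is abbreviated
    here):

      #include <linux/bitops.h>  /* rol32() */

      /* i runs 0..63, but rotating a u32 is only defined for shift
       * counts 0..31. Masking with 31 yields the same rotation and is
       * always well defined.
       * Was: rol32(t, i) -- undefined behavior once i >= 32. */
      static inline u32 sm3_ss1(u32 a, u32 e, u32 t, unsigned int i)
      {
              return rol32(rol32(a, 12) + e + rol32(t, i & 31), 7);
      }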

09 Jul, 2018

1 commit

  • Many shash algorithms set .cra_flags = CRYPTO_ALG_TYPE_SHASH. But this
    is redundant with the C structure type ('struct shash_alg'), and
    crypto_register_shash() already sets the type flag automatically,
    clearing any type flag that was already there. Apparently the useless
    assignment has just been copy+pasted around.

    So, remove the useless assignment from all the shash algorithms.

    This patch shouldn't change any actual behavior.

    Signed-off-by: Eric Biggers
    Signed-off-by: Herbert Xu

    Eric Biggers
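
    The shape of the change, shown on a hypothetical algorithm
    (example_alg is illustrative, not a name from the patch):

      static struct shash_alg example_alg = {
              .digestsize = 32,
              .base       = {
                      .cra_name        = "example",
                      .cra_driver_name = "example-generic",
                      /* Removed: .cra_flags = CRYPTO_ALG_TYPE_SHASH;
                       * crypto_register_shash() clears the type bits and
                       * sets CRYPTO_ALG_TYPE_SHASH itself, so assigning
                       * it here was always a no-op. */
                      .cra_blocksize   = 64,
              },
      };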

22 Sep, 2017

1 commit