Wednesday, November 13, 2019
Log time on JIRA with REST API
If you get the chance to work with people who want you to log everything you do on JIRA, this little script can help:

#!/bin/bash

user="user"
password="password"

# $1 : jira number
# $2 : log to echo
# $3 : time
# $4 : jira log comment
logTime () {
  echo "#######################################"
  echo "Logging $2"
  echo "#######################################"
  curl -k --request POST \
    --url "https://jira.toto.fr/rest/api/2/issue/$1/worklog" \
    --user "$user:$password" \
    --header 'Accept: application/json' \
    --header 'Content-Type: application/json' \
    --data '{
      "comment": "'"$4"'",
      "timeSpent": "'"$3"'"
    }'
}

logTime "Jira-number" "log jira" "time (format : 15m, 4h, 3d ...)" "jira comment"
logTime ...
logTime ...
Monday, August 19, 2019
Daily 2.0 : one browser extension that keeps you up to date
I was searching for ways to find information on development. I was first told about Medium, which is a nice way to get it.
There is also a nice browser extension called Daily 2.0 that basically does the same thing. You just need to add it to your favorite browser, set your preferences by tags, and then read a lot :)
You can find the Chrome extension here
Tuesday, August 6, 2019
What should I say when the customer asks for direct data access ?
On my project, we were asked to open our database to the customer. At first I knew it was bad, but I didn't have the arguments to answer. I finally came up with this mail.
They are now asking customers what they really need to see. I see improvement :)
-----------------------
Hello @all,
In the backlog, we have this user story: As a user, I can request my data on a database copy.
I need to know the reasons for this request. What information do users need?
In general, a request like this one comes from a lack of functionality in the application.
To rephrase, the question would be: what would these users need in the application so that direct access to the database (replicated or not) becomes unnecessary?
If the answer is: "they already have access in their current application", it is not a good answer. :D
I have several problems with this:
- Security on the exposed data
- Performance risk (anyone can make risky SQL queries, even read-only ones)
- Data and performance control is no longer the sole responsibility of the IT department
- It requires training these users on our data model
- It requires support when they can't do what they want
- It requires changing the exposed data when the model evolves
- Risk of having to justify the data structure to the business
- No guarantee that they use the extracted information correctly
- No tests on the exposed data
Topics that discuss the subject:
https://security.stackexchange.com/questions/175344/what-are-the-risks-of-allowing-business-users-direct-production-database-access
https://stackoverflow.com/questions/1559892/why-shouldnt-i-give-outsiders-access-to-my-database
Seen from here, it looks like a reporting / export / API exposure request that has evolved because we don't know what we want to do.
Has the IT department ever authorized this type of request before? If so, I'd be amazed.
Thank you for your answers.
Thursday, July 18, 2019
Clean your mailbox with Cleanfox
If you want to make your world a better place (Miss u MJ) and/or if you just want to get rid of all those newsletters you keep receiving, you can try Cleanfox.
It will automatically detect newsletters / spam and will let you choose what you want to remove. It can also unsubscribe you from those newsletters.
It can be used with all the main mailbox providers and is easy to use.
Trust this nice French company. ;)
Thursday, July 4, 2019
The || operator, a coalesce equivalent in JavaScript
When I first encountered coalesce in SQL, I was amazed by the simplicity and usefulness of the function, so I searched for it in other languages.
The closest I got in JavaScript was the || operator. Used on multiple operands, it evaluates to the first operand that is not null, not undefined, and not false. See below:
const a = null;
const b = false;
const c = 8;

a || b || c; // 8

const d = true;
a || d || c; // true

const e = undefined;
a || e || c; // 8
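One caveat worth keeping in mind (this snippet is my own illustration, not from the original post): since || skips every falsy operand, legitimate values such as 0 or the empty string are replaced as well.

```javascript
// || returns the first truthy operand, so any falsy value
// (null, undefined, false, 0, '', NaN) is skipped -- not just null.
const quantity = 0;                  // a real, meaningful value
const fallback = 10;
const result = quantity || fallback; // 0 is falsy, so || moves on
// result is 10, not 0: || cannot tell "missing" apart from "falsy"
```

For strict coalescing on null/undefined only, ES2020 later introduced the nullish coalescing operator (??).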
Tuesday, July 2, 2019
Remove leading and trailing spaces on Strings across your Spring application with Jackson
If you are using Jackson to convert HTTP messages, you can add a custom String deserializer that removes leading and trailing spaces across your application in a few lines.
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;
import org.apache.commons.lang3.StringUtils;

import java.io.IOException;

public class StringDeserializer extends StdDeserializer<String> {

    private static final long serialVersionUID = 1623333240815834335L;

    public StringDeserializer() {
        this(null);
    }

    private StringDeserializer(Class<?> vc) {
        super(vc);
    }

    @Override
    public String deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
        String valueAsString = jp.getValueAsString();
        if (StringUtils.isEmpty(valueAsString)) {
            return null;
        }
        return valueAsString.trim();
    }
}

To activate it, add this bean to your WebConfig file.
import com.fasterxml.jackson.annotation.JsonAutoDetect.Visibility;
import com.fasterxml.jackson.annotation.PropertyAccessor;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.module.SimpleModule;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
@EnableWebMvc
public class WebConfig implements WebMvcConfigurer {

    @Bean
    public MappingJackson2HttpMessageConverter mappingJackson2HttpMessageConverter() {
        MappingJackson2HttpMessageConverter jsonConverter = new MappingJackson2HttpMessageConverter();
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        objectMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
        objectMapper.setVisibility(PropertyAccessor.FIELD, Visibility.ANY);
        jsonConverter.setObjectMapper(objectMapper);

        SimpleModule module = new SimpleModule();
        module.addDeserializer(String.class, new StringDeserializer());
        objectMapper.registerModule(module);

        return jsonConverter;
    }
}
Angular app: Add unit test coverage on Sonar
We will consider here that you already have an angular application running with some unit tests based on karma / jasmine.
To see the coverage, we will need a few things:

npm install karma-coverage-istanbul-reporter --save-dev
npm install karma-sonarqube-reporter --save-dev

We will also need a launcher (Chrome, PhantomJS, ...). We chose PhantomJS in order not to rely on the platform.
First things first, let's update our karma.conf.js:

module.exports = function(config) {
  config.set({
    basePath: '',
    frameworks: ['jasmine', '@angular-devkit/build-angular'],
    plugins: [
      require('@angular-devkit/build-angular/plugins/karma'),
      require('karma-jasmine'),
      require('karma-phantomjs-launcher'),
      require('karma-coverage-istanbul-reporter'),
      require('karma-sonarqube-reporter')
    ],
    coverageIstanbulReporter: {
      dir: require('path').join(__dirname, '../coverage'),
      reports: ['lcovonly'],
      fixWebpackSourcePaths: true
    },
    sonarqubeReporter: {
      basePath: 'src/app',
      filePattern: '**/*spec.ts',
      encoding: 'utf-8',
      outputFolder: 'reports',
      reportName: () => {
        return 'unitTestsReport.xml';
      }
    },
    reporters: ['progress', 'sonarqube'],
    port: 9876,
    logLevel: config.LOG_INFO,
    browsers: ['PhantomJS'],
    singleRun: true
  });
};

The karma-coverage-istanbul-reporter will generate the lcov.info file, which contains code coverage information in LCOV format.
The karma-sonarqube-reporter will generate unitTestsReport.xml, which contains unit test information (count, successes, failures, elapsed time, ...).
(If you want to generate an HTML report directly in your workspace, you can use the Karma HTML Reporter.)
You can now launch the tests using the Angular CLI:
$ ng test --code-coverage
02 07 2019 13:45:10.192:INFO [karma-server]: Karma v3.1.3 server started at http://0.0.0.0:9876/
02 07 2019 13:45:10.195:INFO [launcher]: Launching browsers PhantomJS with concurrency unlimited
02 07 2019 13:45:10.201:INFO [launcher]: Starting browser PhantomJS
02 07 2019 13:45:18.845:INFO [PhantomJS 2.1.1 (Windows 7.0.0)]: Connected on socket DAOh28181XJnSgGxAAAA with id 31414509
TOTAL: 21 SUCCESS
If you are using TypeScript, you will probably need to set the "sourceMap" property to true in the "test" section of your angular.json file. Without it, when compiling from TypeScript to JavaScript, you will get lcov errors because the report contains line numbers for the generated JavaScript code. (See here).
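As a sketch, the relevant fragment of angular.json would look like this ("my-app" is a placeholder project name; the real file contains many more options):

```json
{
  "projects": {
    "my-app": {
      "architect": {
        "test": {
          "options": {
            "sourceMap": true
          }
        }
      }
    }
  }
}
```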
You might also encounter problems with PhantomJS. I got these weird errors even though everything worked fine with Chrome:
phantomjs Attempting to configure 'style' with descriptor '{"enumerable":true,"configurable":true}'
evaluating 'b.style._clear(a.propertyName(c)) phantomjs
I had to edit polyfills.js (used to make your application compatible with multiple browsers).
Swapping the imports of web-animations-js and zone.js was the solution.
It is now time to tell Sonar how to handle this information. Add a file named sonar-project.properties to your workspace:
sonar.projectKey=project-key
sonar.projectName=project-name
sonar.sources=src
sonar.tests=src/app
sonar.exclusions=**/node_modules/**,**/*.spec.ts,**/coverage/**
sonar.coverage.exclusions=src/*.ts, src/*.js, **/*Dto.ts,**/*Enum.ts,**/*constants.ts, **/*module.ts, **/*config.ts
sonar.test.inclusions=**/*.spec.ts
sonar.typescript.lcov.reportPaths=coverage/lcov.info
sonar.testExecutionReportPaths=reports/unitTestsReport.xml
To see the results on the Sonar server, run it from the command line (path: sonarQubePath/bin/yourOs). It will be accessible at http://localhost:9000.
Add the SonarQube Scanner to your PATH, then run the sonar-scanner command in the base directory of your project. It will pick up sonar-project.properties and send the information to the SonarQube server.
Now you can see your coverage information!
If you want to run the tests before each push, you can use Husky and add them to the pre-push hook:
"husky": {
"hooks": {
"pre-push": "ng test"
}
}
Labels:
JASMINE,
JAVASCRIPT,
JENKINS,
KARMA,
SONAR,
TYPESCRIPT
Tuesday, June 25, 2019
Portable mail server: FakeSMTP
To test sending emails locally, there is FakeSMTP, which provides a cross-platform mail server for testing.
To launch it, go to the installation directory and run:
java -jar fakeSMTP-2.0.jar

The listening port is configurable (25 by default).
"Encouraging" developers to run the tests before a push
On one project, we had issues with the number of broken builds on Jenkins because the tests were not run on our back end. After trying "we'll be careful", we used the following script as a pre-push Git hook.
It relies on the presence of the jacoco-tests.exec file (used by SONAR), which is generated during the clean install. We check that the file exists and that its last modification is less than 20 minutes old (for those who already ran the clean install before pushing, nothing changes in their habits).
Not the best solution, then, but it works.
#!/bin/bash

# File path to the test report generated by clean install
testsFile=./target/jacoco-tests.exec

run_clean_install () {
  echo "Running Maven clean install before push"
  mvn clean install
  if [ $? -ne 0 ]; then
    echo "Error while testing the code, cleaning project"
    rm "$testsFile"
    exit 1
  fi
}

if [ ! -f "$testsFile" ]; then
  echo "File $testsFile not found!"
  run_clean_install
else
  epochTwentyMinutes=1200
  currentEpoch=`/bin/date +%s`
  testsFileEpoch=`/bin/date -r "$testsFile" +%s`
  timePassedTestsFileEpoch=`expr $currentEpoch - $testsFileEpoch`
  if test "$timePassedTestsFileEpoch" -gt "$epochTwentyMinutes"; then
    echo "Obsolete tests"
    run_clean_install
  fi
fi

rm "$testsFile"
echo "Pre-push hook passed"
Place the script in /project_directory/.git/hooks/pre-push
Creating a "CLEAN CODE" macro with IntelliJ
The Save Actions plugin lets you run operations on save or via a shortcut.
Having had issues with the plugin, and not wanting to run operations on every save, I decided to use a macro instead.
In the menu Edit -> Macros, you can record combinations of shortcuts.
For example, add OptimizeImports, ReformatCode and SilentCodeCleanup.
Then just bind a shortcut to the macro and TADA!!
Monday, June 24, 2019
Unit testing exceptions with JUnit
To test an exception in detail in a JUnit test, I used to rely on expected, which only checks that an exception of the given type is thrown.
@Test(expected = MyException.class)

For JUnit 4
Using @Rule and ExpectedException allows more thorough tests.
@Rule
public ExpectedException expectedEx = ExpectedException.none();

@Test
public void shouldThrowRuntimeExceptionWhenEmployeeIDisNull() throws Exception {
    expectedEx.expect(RuntimeException.class);
    expectedEx.expectMessage("Employee ID is null");
    // do something that should throw the exception...
}

You can also test the thrown exception instance directly.
@Test
public void shouldThrowMyExceptionWhenMyMockedServiceThrow() throws Exception {
    MyException myException = Mockito.mock(MyException.class);
    Mockito.when(myMockedService.myMockMethod()).thenThrow(myException);
    expectedEx.expect(Is.is(myException));
    myInjectedMockService.myInjectedMockMethod();
}
However, when we want to verify that a method is not called (because of the exception) via verify, we have to fall back to a classic try / catch.
For JUnit 5
To enable JUnit 5 on your Spring Boot project, follow this link.
With JUnit 5, testing exceptions becomes simple and complete with assertThrows.
Here is an example:
@Test
public void shouldThrowMyExceptionWhenMyMockedServiceThrow() throws Exception {
    MyException myException = Mockito.mock(MyException.class);
    Mockito.when(myMockedService.myMockMethod()).thenThrow(myException);

    MyException thrownException = assertThrows(MyException.class,
            () -> myInjectedMockService.myInjectedMockMethod());

    assertEquals(myException, thrownException);
    verify(myOtherService, times(0)).otherMethod();
}
We get the best of both worlds: we retrieve the exception so we can test it, and we still have control afterwards to make other asserts / verifies.