<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Aymen Rebouh on Medium]]></title>
        <description><![CDATA[Stories by Aymen Rebouh on Medium]]></description>
        <link>https://medium.com/@aymen.rebouh?source=rss-82ceb7481588------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*dT1HqjDgVrPUUTjvFdTvKA.png</url>
            <title>Stories by Aymen Rebouh on Medium</title>
            <link>https://medium.com/@aymen.rebouh?source=rss-82ceb7481588------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Fri, 24 Apr 2026 05:52:55 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@aymen.rebouh/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[How We Built an API Authorization System with a Protobuf Plugin and AI Assistance from proto file…]]></title>
            <link>https://medium.com/eureka-engineering/how-we-built-an-api-authorization-system-with-a-protobuf-plugin-and-ai-assistance-from-proto-file-eba661cdad47?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/eba661cdad47</guid>
            <category><![CDATA[grpc-gateway]]></category>
            <category><![CDATA[golang]]></category>
            <category><![CDATA[grpc]]></category>
            <category><![CDATA[protobuf]]></category>
            <category><![CDATA[protoc]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Mon, 08 Dec 2025 02:34:39 GMT</pubDate>
            <atom:updated>2025-12-08T02:34:39.834Z</atom:updated>
<content:encoded><![CDATA[<h3>How We Built an API Authorization System with a Protobuf Plugin and AI Assistance from proto file annotations</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Yehn_wwV92gwQq0k23OrEA.png" /></figure><h3>Introduction</h3><p>This article is part of the “Eureka Advent Calendar 2025”.</p><blockquote>We built a protobuf plugin that automatically generates authorization maps from proto file annotations, enabling seamless role-based permission management for gRPC/HTTP APIs.</blockquote><h3>Who Should Read This Article?</h3><p>This article is ideal for:</p><ul><li><strong>Backend engineers</strong> looking to implement authorization systems in microservices architectures, particularly those working with gRPC and protobuf.</li><li><strong>Platform engineers</strong> interested in code generation tools.</li><li><strong>Go developers</strong> curious about the protobuf ecosystem, code generation patterns, and building custom protoc plugins.</li></ul><p><strong>Prerequisites:</strong> Basic familiarity with APIs, Go programming, and a general understanding of microservices concepts. No strong prior protobuf or gRPC experience is required; I’ll briefly explain these concepts along the way.</p><p>At <a href="https://eure.jp/">Eureka Inc</a>, we were building a new API service that needed role-based authorization.
Managers required full permissions (create, read, update, delete) while regular users needed only read access.</p><p>Rather than maintaining separate authorization configuration files, we wanted our API permissions to live directly alongside our API definitions in our protobuf files, a true single source of truth.</p><p>This article shows how we built a custom protobuf compiler plugin with AI assistance to automatically generate authorization code from proto annotations.</p><p><strong>The technologies and frameworks involved</strong></p><ul><li><strong>protobuf (Protocol Buffers)</strong> is a system for serializing structured data, developed by Google. It uses a specialized Interface Definition Language (IDL) to define the structure of data and service interfaces.</li><li><strong>gRPC</strong> is a Remote Procedure Call (RPC) framework that can run in any environment and efficiently connect services. It also lets us generate API documentation and server code interfaces automatically.</li><li>On top of that, we added <strong>gRPC-Gateway</strong>, a plugin of the Google protocol buffers compiler <a href="https://github.com/protocolbuffers/protobuf">protoc</a>.
It reads protobuf service definitions and generates a reverse-proxy server which translates a RESTful HTTP API into gRPC.</li></ul><p>We were already using protobuf to generate OpenAPI documentation automatically whenever new endpoints were added to the proto files. Writing the permissions directly in those same proto files, and generating the authorization code at the same time, seemed quite convenient for engineers, so we decided to give it a try.</p><p>Once our solution was implemented, we ended up with an automatically generated map of endpoints and permissions, ready to be used in our application, that looks like this ⬇️</p><pre>syntax = &quot;proto3&quot;;<br><br>service TestService {<br>  rpc TestNoPermissions(TestNoPermissionsRequest) returns (TestNoPermissionsResponse) {<br>    option (google.api.http) = {<br>      post: &quot;/v1/test/{foo_id}&quot;<br>      body: &quot;*&quot;<br>    };<br>    // No auth is required for this endpoint<br>    option (proto.v1.authz) = {no_auth_required: true};<br>  }<br><br>  rpc TestWithPermissions(TestWithPermissionsRequest) returns (TestWithPermissionsResponse) {<br>    option (google.api.http) = {<br>      post: &quot;/v1/test2/{foo_id}&quot;<br>      body: &quot;*&quot;<br>    };<br>    // This endpoint requires READ access<br>    option (proto.v1.authz) = {permissions: [&quot;read:all&quot;]};<br>  }<br>}</pre><pre>// generatedAuthzMap contains authorization rules extracted from <br>// the proto definitions above.<br>// This map is automatically generated when the `buf generate` command is executed.<br>var generatedAuthzMap = map[string]AuthzRule{<br> &quot;/v1/test/{foo_id}|POST&quot;: {<br>  Permissions:    []string{},<br>  NoAuthRequired: true,<br> },<br> &quot;/v1/test2/{foo_id}|POST&quot;: {<br>  Permissions:    []string{&quot;read:all&quot;},<br>  NoAuthRequired: false,<br> },<br>}<br><br>// example usage<br>if hasPermissions(generatedAuthzMap, reqEndpoint,
reqMethod, userPermissions) { ... }</pre><p>⬆️ You can see that this generated authorization map actually maps HTTP endpoints rather than gRPC methods. Normally, the gRPC-Gateway works by translating an HTTP/JSON request into a gRPC request and then making a real gRPC network call to the gRPC server. In our setup, however, we simplified it: the gateway and the gRPC server run together inside the same process.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/474/1*hbikMoXXTxcknyr5-W-P5g.png" /></figure><p>Instead of sending a new gRPC request over the network, the gateway directly unmarshals JSON into the generated Protobuf models and passes them straight into our service implementation. As a result, there is no gRPC transport layer and no double marshalling involved.</p><p>Running the gRPC-Gateway this way is not the recommended setup, but it has its advantages. It did, however, force us to use an HTTP middleware instead of a gRPC interceptor for our authorization checks, which is why the generated authorization map maps actual HTTP endpoints and HTTP methods.</p><p>By the end of this article, you will have learned about the protobuf compiler and its plugins, navigated through the different Go packages that the AI agent ended up using, and more.</p><h3><strong>Why this project was perfect for AI-assisted development</strong></h3><h4><strong>Clear requirements, uncertain implementation 💡</strong></h4><p><strong>The Goal:</strong> we needed to automatically generate authorization code from our proto files. Instead of manually maintaining a separate authorization configuration, we wanted our API permissions to live directly alongside the API definitions in our protobuf files (to really push that single source of truth idea).</p><p><strong>The Challenge:</strong> while we knew exactly what we wanted to achieve, some of us had never built a protobuf compiler plugin before.
Protoc (protobuf compiler) plugins are specialized tools that extend the protobuf compiler to generate custom code, in our case an authorization map. This involved learning an entirely new ecosystem of APIs and conventions.</p><p>This is the sweet spot for AI: clear goals combined with technical uncertainty about the implementation.</p><h4><strong>Code generating code 🫨</strong></h4><p>Meta-programming (writing programs that write programs) presents cognitive challenges:</p><ul><li>Debugging becomes complex because errors can exist in either the generator or the generated code.</li><li>Template-like code generation requires careful escaping and formatting.</li></ul><p>AI assistance is quite helpful here because it catches template formatting issues that are easy to miss, and spares you some headaches 😅.</p><h4><strong>Text parsing complexity 🔎</strong></h4><p>Proto files, while structured, are ultimately text files with complex syntax rules:</p><ul><li>Nested brace structures for method definitions and options</li><li>Multiple comment styles (// and `/* */`) that need to be handled</li><li>Whitespace variations and formatting differences across teams</li><li>etc.</li></ul><p>These parsing challenges involve algorithmic complexity that AI can help iterate through.
Rather than spending hours debugging regex patterns or brace-counting logic, AI can help rapidly prototype and refine parsing algorithms.</p><h3><strong>AI-Assisted Discovery: Learning the protogen Framework</strong></h3><p>Rather than spending days reading documentation and examples, we were able to rapidly explore the protogen ecosystem, which ultimately allowed us to generate custom Go code from <strong>.proto</strong> files, understand its conventions, and implement our solution iteratively.</p><p>Here’s how we navigated this unfamiliar ecosystem with AI as our guide.</p><h4><strong>Usage of buf.gen.yaml</strong></h4><p><strong>buf.gen.yaml</strong> is a configuration file specifically designed for code generation.</p><pre>version: v2<br>clean: true<br>managed:<br>  enabled: true<br>  disable:<br>    - file_option: go_package_prefix<br>plugins:<br>  # Generate Go code from the proto files<br>  - local: [go, tool, google.golang.org/protobuf/cmd/protoc-gen-go]<br>    out: ./gen<br>    opt:<br>      - paths=import<br>  # Generate the authorization map from the custom annotations added in the proto files<br>  - local: [go, run, ./protoc-gen-go-authz]<br>    out: ./gen<br>    opt:<br>      - paths=source_relative<br>    strategy: all</pre><p>It is used by the buf generate command to define how Protobuf plugins are executed to generate code from your proto files.</p><p>When we run buf generate, tools like <strong>buf</strong> or <strong>protoc</strong> read the .proto definitions and produce strongly typed Go code, including message structs, service interfaces, and REST handlers.</p><p>It ensures all team members produce consistent code and makes it easy to add custom plugins like the authorization one we added.</p><h4><strong>How AI helped navigate unfamiliar APIs and conventions:</strong></h4><p>The process is mainly separated into two parts:</p><ul><li>main.go = Orchestrator (High-Level)</li><li>parser.go = Algorithm Worker (Low-Level Details)</li></ul><p>The full
code is written in this <a href="https://github.com/Aymenworks/golang-proto-authorization-plugin">open-source repository</a>, but to give you an idea:</p><p><strong>main.go never touches regex, file I/O, or string parsing. It just says:</strong></p><blockquote>“Hey parser, give me the authz rule annotations from these proto files”</blockquote><p>parser.go does all the dirty work:</p><blockquote>“I’ll read the files, count braces, strip comments, parse permissions, and hand you clean authzRule structs”</blockquote><p>This is separation of concerns: the orchestrator stays clean and delegates complex parsing logic to a specialized component.</p><p>So let’s dive into what exactly those files are doing.</p><h3><strong>The main orchestration file</strong></h3><p>The <a href="https://github.com/Aymenworks/public-medium-protocgen/blob/main/protoc-gen-go-authz/main.go">main.go</a> file serves as the high-level orchestrator for our plugin. Its primary responsibility is to interface with the protobuf compiler ecosystem and delegate the complex parsing work to specialized components. Here’s how it’s structured:</p><h4><strong>1. Plugin Initialization and Feature Support</strong></h4><p>First, we set up the plugin capabilities and entry point:</p><pre>package main<br><br>import (<br>    &quot;google.golang.org/protobuf/compiler/protogen&quot;<br>    &quot;google.golang.org/protobuf/types/pluginpb&quot;<br>)<br><br>func main() {<br>    // Start the plugin<br>    protogen.Options{}.Run(func(plugin *protogen.Plugin) error {<br>        // Tell protoc what features our plugin supports<br>        plugin.SupportedFeatures = uint64(pluginpb.CodeGeneratorResponse_FEATURE_PROTO3_OPTIONAL)<br>        return generateAuthzCode(plugin)<br>    })<br>}</pre><p>The SupportedFeatures declaration is important. Without this, protoc assumes we’re a legacy plugin and might not send us newer protobuf features or generate warnings in some cases.</p><h4><strong>2.
The Core Generation Loop</strong></h4><p>The heart of our orchestrator iterates through proto files and delegates parsing:</p><pre>func generateAuthzCode(plugin *protogen.Plugin) error {<br>  // Create output file<br>  filename := &quot;authzmap/generated_authz_map.go&quot;<br>  gen := plugin.NewGeneratedFile(filename, &quot;github.com/aymenworks/public-medium-protocgen/gen/authzmap&quot;)<br><br>  parser := newProtoAuthzParser()<br>  var allAuthzRules []authzRule<br><br>  // Process each proto file<br>  for _, file := range plugin.Files {<br>   // Filter out files that shouldn&#39;t be generated (file.Generate will be false)<br>   if !file.Generate {<br>    continue<br>   }<br><br>   rules := parser.parseFile(file)<br>   allAuthzRules = append(allAuthzRules, rules...)<br>  }<br><br>  // Always generate the authz map file, even if empty<br>  // This ensures the package exists for imports<br>  generateAuthzMapFile(plugin, allAuthzRules)</pre><p>Notice how <strong>main.go</strong> never touches regex, file I/O, or string parsing. It stays at the orchestration level.</p><h4><strong>3. Code Generation with protogen’s Magic</strong></h4><p>The <strong>gen.P()</strong> method is where the magic happens. It generates our file content: the authorization map, utility functions, structs, etc.</p><p>It looks simple but handles complex plumbing:</p><pre>func generateGoCode(gen *protogen.GeneratedFile, rules []authzRule) error {<br>    // Generate package header<br>    gen.P(&quot;// Code generated by protoc-gen-go-authz.
DO NOT EDIT.&quot;)<br>    gen.P(&quot;package authzmap&quot;)<br>    gen.P()<br>    gen.P(&quot;import \&quot;strings\&quot;&quot;)<br>    gen.P()<br><br>    // Generate struct definition<br>    gen.P(&quot;type AuthzRule struct {&quot;)<br>    gen.P(&quot;    Permissions    []string&quot;)<br>    gen.P(&quot;    NoAuthRequired bool&quot;)<br>    gen.P(&quot;}&quot;)<br>    gen.P()<br><br>    // Generate the authorization map<br>    gen.P(&quot;var generatedAuthzMap = map[string]AuthzRule{&quot;)<br>    for _, rule := range rules {<br>        gen.P(&quot;    \&quot;&quot;, rule.endpoint, &quot;|&quot;, rule.method, &quot;\&quot;: {&quot;)<br>        gen.P(&quot;        Permissions:    []string{&quot;, formatPermissions(rule.permissions), &quot;},&quot;)<br>        gen.P(&quot;        NoAuthRequired: &quot;, rule.noAuthRequired, &quot;,&quot;)<br>        gen.P(&quot;    },&quot;)<br>    }<br>    gen.P(&quot;}&quot;)<br><br>    return nil<br>}</pre><h4><strong>Why not just use os.Create() and write files directly, instead of protogen.GeneratedFile and gen.P()?</strong></h4><p>The <strong>protogen.GeneratedFile</strong> and <strong>gen.P()</strong> pattern exists because:</p><p><a href="https://pkg.go.dev/google.golang.org/protobuf/types/pluginpb"><strong>Protocol compliance</strong></a>:</p><ol><li>protoc compiles the .proto files into a structured representation and packages this information into a <a href="https://github.com/protocolbuffers/protobuf-go/blob/v1.36.10/compiler/protogen/protogen.go#L69">CodeGeneratorRequest</a>.</li></ol><pre>// https://github.com/protocolbuffers/protobuf-go/blob/v1.36.10/compiler/protogen/protogen.go#L69<br>func run(opts Options, f func(*Plugin) error) error {<br> if len(os.Args) &gt; 1 {<br>  return fmt.Errorf(&quot;unknown argument %q (this program should be run by protoc, not directly)&quot;, os.Args[1])<br> }<br> in, err := io.ReadAll(os.Stdin)<br> if err != nil {<br>  return err<br> }<br> req :=
&amp;pluginpb.CodeGeneratorRequest{}<br> if err := proto.Unmarshal(in, req); err != nil {<br>  return err<br> }</pre><p>2. Our protoc plugin receives this request, processes the descriptors and annotations, generates the desired code (whatever we want our plugin to produce), and then packages the results into a <a href="https://github.com/protocolbuffers/protobuf-go/blob/v1.36.10/compiler/protogen/protogen.go#L412">CodeGeneratorResponse</a>.</p><pre>// https://github.com/protocolbuffers/protobuf-go/blob/v1.36.10/compiler/protogen/protogen.go#L69<br>func run(opts Options, f func(*Plugin) error) error {<br>  ...<br>   resp := gen.Response() // CodeGeneratorResponse<br>   out, err := proto.Marshal(resp)<br>   if err != nil {<br>    return err<br>   }<br>   if _, err := os.Stdout.Write(out); err != nil {<br>    return err<br>   }<br>}</pre><p>3. protoc receives the CodeGeneratorResponse and handles the final output, such as writing the generated code to files.</p><p>protogen.GeneratedFile and gen.P() are specifically designed to build this response, abstracting away the low-level details of interacting with protoc, which is why they are so convenient.</p><p><strong>Simplified Code Generation:</strong></p><p>gen.P() provides a convenient way to write lines of code to the generated file. It handles <a href="https://github.com/protocolbuffers/protobuf-go/blob/v1.36.10/compiler/protogen/protogen.go#L1117">indentation</a>, <a href="https://github.com/protocolbuffers/protobuf-go/blob/v1.36.10/compiler/protogen/protogen.go#L1122">line breaks</a>, and other formatting aspects, making the plugin code cleaner and easier to read compared to manually concatenating strings and adding newlines.</p><h4><strong>4.
What AI Helped Us Learn</strong></h4><p>Working with an unfamiliar framework like protogen, AI assistance was invaluable for understanding:</p><ul><li>The stdin/stdout communication protocol between protoc and plugins</li><li>Why plugin.SupportedFeatures matters for modern protobuf features</li><li>Which of the plugin.Files have file.Generate = true (only the PRJ files for our config)</li><li>The buffering strategy behind GeneratedFile, which does not write to the file until the plugin has finished executing.</li></ul><h3><strong>The parser file</strong></h3><p>While <strong>main.go</strong> handles orchestration, <strong>parser.go</strong> does the heavy lifting of extracting authorization information from proto files. This is where AI assistance truly shone, helping us navigate complex text parsing challenges that would have taken days to debug manually.</p><p><a href="https://github.com/Aymenworks/golang-proto-authorization-plugin/blob/main/protoc-gen-go-authz/parser.go#L102">golang-proto-authorization-plugin/protoc-gen-go-authz/parser.go at main · Aymenworks/golang-proto-authorization-plugin</a></p><h4><strong>The Challenge: Parsing Proto Files as Text</strong></h4><p>Here’s what we needed to extract from our proto files:</p><pre>service TestService {<br>  rpc TestWithPermissions(TestWithPermissionsRequest) returns (TestWithPermissionsResponse) {<br>    option (google.api.http) = {<br>      post: &quot;/v1/test2/{foo_id}&quot;<br>      body: &quot;*&quot;<br>    };<br>    option (proto.v1.authz) = {<br>      permissions: [&quot;read:all&quot;]<br>    };<br>  }<br>}</pre><p>From this, we need to extract:</p><ul><li>RPC method name: TestWithPermissions</li><li>HTTP path: /v1/test2/{foo_id}</li><li>HTTP method: POST</li><li>Permissions: [&quot;read:all&quot;]</li><li>The no_auth_required flag: false (since permissions are specified, auth is required)</li></ul><p><strong>1.
Parser Structure and File Processing</strong></p><p>The parser is organized as a struct with methods for processing different levels of the proto file:</p><pre>// protoAuthzParser handles parsing of authz options from proto files.<br>type protoAuthzParser struct {<br>    authzExtensionNumber protoreflect.FieldNumber<br>}<br><br>func newProtoAuthzParser() *protoAuthzParser {<br>    return &amp;protoAuthzParser{<br>        authzExtensionNumber: 50001, // proto.v1.authz extension number from option.proto<br>    }<br>}<br><br>// parseFile extracts all authz rules from a proto file.<br>func (p *protoAuthzParser) parseFile(file *protogen.File) []authzRule {<br>    rules := make([]authzRule, 0, len(file.Services))<br><br>    for _, service := range file.Services {<br>        serviceRules := p.parseService(service)<br>        rules = append(rules, serviceRules...)<br>    }<br><br>    return rules<br>}<br><br>// parseService extracts authz rules from all methods in a service.<br>func (p *protoAuthzParser) parseService(service *protogen.Service) []authzRule {<br>    rules := make([]authzRule, 0, len(service.Methods))<br><br>    for _, method := range service.Methods {<br>        rule, err := p.parseMethod(method)<br>        if err != nil {<br>            // Skip methods without authz options - this is normal<br>            continue<br>        }<br>        rules = append(rules, rule)<br>    }<br><br>    return rules<br>}</pre><p>You can see that there are ready-to-use utility fields and methods in the protogen library, such as file.Services and service.Methods, giving us access to the list of services and methods within the proto files.</p><p><strong>2.
Method Parsing and File-based Extraction</strong></p><p>The core method parsing delegates to file-based text parsing when protobuf descriptors don’t contain our custom extensions:</p><pre>// parseMethod extracts authz rule from a single method.<br>func (p *protoAuthzParser) parseMethod(method *protogen.Method) (authzRule, error) {<br>    // Extract authz permissions and no_auth_required flag<br>    permissions, noAuthRequired, err := p.extractAuthzOptions(method)<br>    if err != nil {<br>        return authzRule{}, fmt.Errorf(&quot;failed to extract authz options: %w&quot;, err)<br>    }<br><br>    // Extract HTTP information<br>    httpPath, httpMethod, err := p.extractHTTPInfo(method)<br>    if err != nil {<br>        return authzRule{}, fmt.Errorf(&quot;failed to extract HTTP info: %w&quot;, err)<br>    }<br><br>    return authzRule{<br>        HTTPPath:       httpPath,<br>        HTTPMethod:     httpMethod,<br>        Permissions:    permissions,<br>        NoAuthRequired: noAuthRequired,<br>    }, nil<br>}<br><br>// extractFromProtoSource extracts permissions and no_auth_required by examining the proto source.<br>func (p *protoAuthzParser) extractFromProtoSource(method *protogen.Method) ([]string, bool, error) {<br>    // Get the proto file path and read it<br>    protoPath := method.Desc.ParentFile().Path()<br>    methodName := string(method.Desc.Name())<br><br>    // Extract from the proto file content for any service/method<br>    return p.extractAuthzFromProtoFile(protoPath, methodName)<br>}</pre><p><strong>3. 
HTTP Information Extraction using Protobuf Extensions</strong></p><p>For HTTP path and method extraction, we use protobuf’s built-in extension system:</p><pre>service TestService {<br>  rpc TestWithPermissions(TestWithPermissionsRequest) returns (TestWithPermissionsResponse) {<br>    option (google.api.http) = {<br>      post: &quot;/v1/test2/{foo_id}&quot;<br>      body: &quot;*&quot;<br>    };<br>    option (proto.v1.authz) = {<br>      permissions: [&quot;read:all&quot;]<br>    };<br>  }<br>}</pre><pre>// extractHTTPInfo extracts HTTP path and method from google.api.http annotation.<br>func (p *protoAuthzParser) extractHTTPInfo(method *protogen.Method) (string, string, error) {<br>    methodOpts := method.Desc.Options().(*descriptorpb.MethodOptions)<br><br>    // Check if google.api.http extension exists<br>    if proto.HasExtension(methodOpts, annotations.E_Http) {<br>        httpRule := proto.GetExtension(methodOpts, annotations.E_Http)<br>        if httpRule != nil {<br>            return p.extractHTTPInfoFromRule(httpRule)<br>        }<br>    }<br><br>    return &quot;&quot;, &quot;&quot;, fmt.Errorf(&quot;no HTTP annotation found&quot;)<br>}</pre><p>method.Desc.Options().(*descriptorpb.MethodOptions) and proto.GetExtension(methodOpts, annotations.E_Http) get the protobuf method options and check for HTTP extension annotations (annotations.E_Http identifies the google.api.http extension).</p><p>From there, we can already get access to post:&quot;/v1/test2/{foo_id}&quot; body:&quot;*&quot;</p><pre>// extractHTTPInfoFromRule extracts path and method from HTTP rule.<br>func (p *protoAuthzParser) extractHTTPInfoFromRule(httpRule any) (string, string, error) {<br> // The HTTP rule should be a message containing HTTP info<br> msg, ok := httpRule.(protoreflect.ProtoMessage)<br> if !ok {<br>  return &quot;&quot;, &quot;&quot;, fmt.Errorf(&quot;HTTP rule is not a proto message&quot;)<br> }<br><br> reflectMsg := msg.ProtoReflect()<br> fields
:= reflectMsg.Descriptor().Fields()<br><br> // Check for different HTTP methods (get, post, put, delete, patch)<br> for i := range fields.Len() {<br>  field := fields.Get(i)<br>  if !reflectMsg.Has(field) {<br>   continue<br>  }<br><br>  switch field.Name() {<br>  case &quot;get&quot;:<br>   path := reflectMsg.Get(field).String()<br>   return path, &quot;GET&quot;, nil<br>  case &quot;post&quot;:<br>   path := reflectMsg.Get(field).String()<br>   return path, &quot;POST&quot;, nil<br>  ...<br>  }<br> }<br><br> return &quot;&quot;, &quot;&quot;, fmt.Errorf(&quot;no HTTP method found in rule&quot;)<br>}</pre><p>The HTTP rule is a protobuf message, so protobuf reflection is used to inspect it via ProtoReflect. Here, reflection is used to enumerate the possible HTTP method fields of the annotation as well as to read the actual field values (in our case, post).</p><p><strong>4. Brace counting algorithms, regex patterns to handle whitespace variations, etc.</strong></p><p>You can see more details of their usage and specificities in the <a href="https://github.com/Aymenworks/golang-proto-authorization-plugin">open-source repository code</a>.</p><p><strong>5. Where AI Made the Difference</strong></p><p>Working on this parser, AI assistance was invaluable for:</p><ul><li><strong>The usage of ProtoReflect: </strong>AI helped find out right away which protobuf fields and methods to use</li><li><strong>Regex refinement</strong>: Iteratively improving regex patterns to handle whitespace variations and optional elements</li><li>etc.</li></ul><p>The most challenging aspect was that proto files aren’t just structured data. They’re text files with complex syntax that can vary significantly in formatting.
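</p><p>As an illustration of the whitespace-variation problem, here is a hypothetical sketch (not the repository’s actual pattern) of a regex that extracts the permission list from an authz option regardless of spacing:</p><pre>

```go
package main

import (
	"fmt"
	"regexp"
)

// permsRe tolerates arbitrary whitespace around the option name, the
// braces, and the permission list. The option path and field name are
// the ones from the article's examples.
var permsRe = regexp.MustCompile(
	`option\s*\(\s*proto\.v1\.authz\s*\)\s*=\s*\{\s*permissions\s*:\s*\[([^\]]*)\]`,
)

// itemRe pulls each quoted permission out of the matched list.
var itemRe = regexp.MustCompile(`"([^"]*)"`)

// parsePermissions returns the permission strings found in one authz
// option, or nil when the option is absent.
func parsePermissions(src string) []string {
	m := permsRe.FindStringSubmatch(src)
	if m == nil {
		return nil
	}
	var perms []string
	for _, item := range itemRe.FindAllStringSubmatch(m[1], -1) {
		perms = append(perms, item[1])
	}
	return perms
}

func main() {
	src := `option ( proto.v1.authz ) = { permissions : ["read:all", "write:self"] };`
	fmt.Println(parsePermissions(src)) // [read:all write:self]
}
```

</pre><p>Patterns like this typically need several AI-assisted iterations before they survive every formatting style a team actually uses.</p><p>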
AI helped us build a robust parser that handles these variations.</p><h4><strong>Challenges debugging code that generates code</strong></h4><p>Obviously, if you don’t take the time to understand in detail how the generated code works, you will have a tough time debugging and investigating issues with it.</p><p>So while AI is very convenient for discovery and explanations, it is worth taking the time to give the generated code a proper read.</p><h3><strong>Key Takeaways</strong></h3><p>Building this protobuf authorization plugin taught us valuable lessons about when and how to leverage AI effectively in software development.</p><h4><strong>When AI excels</strong></h4><p>Discovering unfamiliar APIs, writing boilerplate code, and recognizing patterns in complex parsing logic. Without AI assistance, those would have been extremely time-consuming.</p><h4><strong>Where to be careful</strong></h4><p>Although everything ended up being smooth, it was not that easy from the beginning.</p><ul><li>Poor instructions made the AI agent create a plugin that was “dumb”. For example, the plugin is supposed to read the proto files every time and generate the authorization map from them, but the initial AI version had the authorization map hardcoded directly in the plugin, which prevented it from being dynamic at all.</li><li>It didn’t initially handle parameterized paths such as /foo/{id}/create and had issues with those. When the algorithm tried to match a request like /foo/2/create against that template, it failed, even though the two should match.
So we had to go back and forth with the agent to support those cases.</li></ul><p>Poor communication can lead the AI agent to generate code that you don’t expect or that misses edge cases, so iterating with the agent is recommended to make sure the alignment is clear and edge cases are covered; ideally, have some unit tests ready!</p><p>You can find the open-source repository in the link below, allowing you to use and experiment with the protoc plugin as you wish.</p><ul><li><a href="https://github.com/Aymenworks/golang-proto-authorization-plugin">https://github.com/Aymenworks/golang-proto-authorization-plugin</a></li></ul><hr><p><a href="https://medium.com/eureka-engineering/how-we-built-an-api-authorization-system-with-a-protobuf-plugin-and-ai-assistance-from-proto-file-eba661cdad47">How We Built an API Authorization System with a Protobuf Plugin and AI Assistance from proto file…</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[GopherCon San Diego 2023]]></title>
            <link>https://medium.com/eureka-engineering/gophercon-san-diego2023-db57685ea61f?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/db57685ea61f</guid>
            <category><![CDATA[golang]]></category>
            <category><![CDATA[json]]></category>
            <category><![CDATA[telemetry]]></category>
            <category><![CDATA[gophercon]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Wed, 06 Dec 2023 23:03:51 GMT</pubDate>
            <atom:updated>2023-12-18T07:42:05.047Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ixXnktmJd0UJdiQ7lDGWyQ.png" /><figcaption>Illustration from <a href="https://www.gophercon.com/home">https://www.gophercon.com/home</a></figcaption></figure><p>🎅🏻 <em>This post is part of the Eureka Advent Calendar 2023</em> 🎅🏻</p><p>I have been living in Japan for a few years now, and apart from my hometown in France, it&#39;s been a long time since I had been outside Japan.</p><p>At the end of September, my coworker and friend <a href="https://medium.com/@jimeux">James</a> and I had the opportunity to travel to San Diego, California to attend <a href="https://www.gophercon.com/home">GopherCon 2023</a>.</p><p>I want to thank <a href="https://eure.jp/">Eureka</a>, our company, for sponsoring our conference trip and giving us the opportunity to meet new people, learn, and experience new things.</p><p>Eureka aims to make dating a social norm in Japan. We&#39;re developing one of the top matching applications, <a href="https://apps.apple.com/jp/app/pairs-%E3%83%9A%E3%82%A2%E3%83%BC%E3%82%BA-%E6%81%8B%E6%B4%BB-%E5%A9%9A%E6%B4%BB%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E3%83%9E%E3%83%83%E3%83%81%E3%83%B3%E3%82%B0%E3%82%A2%E3%83%97%E3%83%AA/id583376064"><strong>Pairs</strong></a>, released in October 2012, which is the most used love/marriage matching app in Japan (*1), with over 20 million registered users (*2).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*H34at3d63419dCGw.png" /><figcaption><em>(*1) Cumulative number of registrations since the service started in 2012.
<br>(*2) MMD Institute “2022 Matching Service/App Usage Survey” as of September 2022</em></figcaption></figure><p>We are hiring and have <a href="https://career.pairs.lv/">different roles available</a>, so if you&#39;re interested, feel free to send me a message on <a href="https://twitter.com/aymenworks">Twitter</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*tm1YYl6b37uVB1a0bsc4Gg.jpeg" /></figure><p>The conference took place over four days: the first two focused on workshops, the last two on talks. The talks ranged from light to deep, and from technical to non-technical.</p><p>Some people have already written articles about things they learned at the conference, or dived into a specific talk, like my coworker James did about capslock.</p><p><a href="https://medium.com/eureka-engineering/what-are-your-go-dependencies-capable-of-an-introduction-to-capslock-b757833c9847">What are your Go dependencies capable of?—An introduction to capslock</a></p><p>On my side, I will share a couple of side topics that I found interesting for the community: JSON v2 and telemetry in Go.</p><h3>JSON v2</h3><p>JSON, a text-based data exchange format, is among the top imported packages, as highlighted by Joe Tsai during GopherCon.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Iv78_2s4VIw5vgFw3GZcHw.png" /></figure><p>It has been used for years, and over time, missing functionality and performance limitations became a concern for many people. 
<br>At the time of writing this article, there are around <a href="https://github.com/golang/go/issues?q=is%3Aissue+is%3Aopen+%22proposal%3A+encoding%2Fjson%22+in%3Atitle+sort%3Areactions-%2B1-desc+">85 proposals</a> for the encoding/json package.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*02ODLX_yjNK8ltetEoRj4g.png" /></figure><h4>Desired features</h4><p>The most desired missing features tend to be the ones that, over time, required engineers to come up with hacky workarounds for a specific issue, whether for convenience and/or to avoid writing the same boilerplate code again and again.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*kuj8YqcRBMlSvkjoRBxOXw.png" /></figure><p>One example is being able to annotate a custom date/time format directly within the JSON tag, to avoid workarounds such as having to write a Time wrapper as below.</p><p><strong>Before: using a Time wrapper:</strong></p><pre>package main<br><br>import (<br>   &quot;encoding/json&quot;<br>   &quot;log&quot;<br>   &quot;strings&quot;<br>   &quot;time&quot;<br>)<br><br>type GopherCon struct {<br>   StartDate ConferenceDate `json:&quot;start_date&quot;`<br>   EndDate   ConferenceDate `json:&quot;end_date&quot;`<br>}<br><br>type ConferenceDate time.Time<br><br>func (m *ConferenceDate) UnmarshalJSON(p []byte) error {<br>   t, err := time.Parse(time.DateOnly, strings.Replace(string(p), &quot;\&quot;&quot;, &quot;&quot;, -1))<br>   if err != nil {<br>      return err<br>   }<br>   *m = ConferenceDate(t)<br>   return nil<br>}<br><br>func main() {<br>   jsonData := `{&quot;start_date&quot;: &quot;2023-09-25&quot;, &quot;end_date&quot;: &quot;2023-09-28&quot;}`<br>   var conf GopherCon<br>   if err := json.Unmarshal([]byte(jsonData), &amp;conf); err != nil {<br>      log.Fatal(err)<br>   }<br>}</pre><p><strong>After: with v2, you will be able to use a new custom struct tag 
format:</strong></p><pre>package main<br><br>import (<br>   &quot;log&quot;<br>   &quot;time&quot;<br>  <br>   &quot;github.com/go-json-experiment/json&quot;<br>)<br><br>type GopherCon struct {<br>   StartDate time.Time `json:&quot;start_date,format:&#39;2006-01-02&#39;&quot;`<br>   EndDate   time.Time `json:&quot;end_date,format:&#39;2006-01-02&#39;&quot;`<br>}<br><br>func main() {<br>   jsonData := `{&quot;start_date&quot;: &quot;2023-09-25&quot;, &quot;end_date&quot;: &quot;2023-09-28&quot;}`<br>   var conf GopherCon<br>   if err := json.Unmarshal([]byte(jsonData), &amp;conf); err != nil {<br>      log.Fatal(err)<br>   }<br>}</pre><p>Quite cool and straightforward, isn&#39;t it?</p><p>Another example is how it&#39;s not seamless to omit an empty struct-typed field without using pointers. In v2, omitempty has been redefined in terms of the JSON type system rather than the Go type system.</p><p><strong>Before: you needed a pointer on the *Venue object below to omit it from the JSON when it was not set.</strong></p><pre>package main<br><br>import (<br>   &quot;encoding/json&quot;<br>   &quot;fmt&quot;<br>   &quot;time&quot;<br>)<br><br>type Venue struct {<br>   Name string `json:&quot;name,omitempty&quot;`<br>}<br>type GopherCon struct {<br>   StartDate time.Time `json:&quot;start_date&quot;`<br>   Venue     *Venue    `json:&quot;venue,omitempty&quot;`<br>}<br><br>func main() {<br>   conf := GopherCon{StartDate: time.Date(2023, 11, 10, 0, 0, 0, 0, time.UTC)}<br>   b, err := json.Marshal(conf)<br>   if err != nil {<br>      fmt.Println(err)<br>   }<br>   fmt.Println(string(b))<br>}<br><br>// result: {&quot;start_date&quot;:&quot;2023-11-10T00:00:00Z&quot;}</pre><p><strong>With the v2 version, </strong>omitempty will omit the field if the value would have been encoded as an empty JSON value, and this applies recursively<strong>:</strong></p><pre>package main<br><br>import (<br>   &quot;fmt&quot;<br>   &quot;time&quot;<br>  
<br>   &quot;github.com/go-json-experiment/json&quot;<br>)<br><br>type Venue struct {<br>   Name string `json:&quot;name,omitempty&quot;`<br>}<br>type GopherCon struct {<br>   StartDate time.Time `json:&quot;start_date,format:&#39;2006-01-02&#39;&quot;`<br>   Venue     Venue     `json:&quot;venue,omitempty&quot;`<br>}<br><br>func main() {<br>   conf := GopherCon{StartDate: time.Date(2023, 11, 10, 0, 0, 0, 0, time.UTC)}<br>   b, err := json.Marshal(conf)<br>   if err != nil {<br>      fmt.Println(err)<br>   }<br>   fmt.Println(string(b))<br>}<br><br>// result: {&quot;start_date&quot;:&quot;2023-11-10&quot;}</pre><p>If you&#39;re interested, you can find more about it in the discussion right <a href="https://github.com/golang/go/discussions/63397">here</a>.</p><h4>Performance</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ci6L3gaVx5dQwLGDJ9Hgug.png" /></figure><p>It&#39;s impressive how much faster Unmarshal got, but if you look at the existing Unmarshal implementation, you can see that there is a caveat.</p><pre>func Unmarshal(data []byte, v any) error {<br>  // Check for well-formedness.<br>  // Avoids filling out half a data structure<br>  // before discovering a JSON syntax error.<br>  var d decodeState<br>  err := checkValid(data, &amp;d.scan)<br>  if err != nil {<br>      return err<br>  }<br><br>  d.init(data)<br>  return d.unmarshal(v)<br>}</pre><p>In addition to the unmarshalling at the end, the JSON data first needs to be read entirely by checkValid to check whether it is valid JSON. So there are two costly passes over the data.</p><p>The existing json implementation does <a href="https://github.com/golang/go/blob/master/src/encoding/json/decode.go#L646">cache some field information</a>, which helps process the data faster after it has been cached.</p><pre>func (d *decodeState) object(v reflect.Value) error {<br>  ...<br>  var fields structFields<br>  ...<br>  switch v.Kind() {<br>  ... 
<br>  case reflect.Struct:<br>    fields = cachedTypeFields(t)<br>  ...<br>}</pre><p>Still, this double parsing, first to validate the data and then to unmarshal it, is what makes the current implementation heavy.</p><p>If you&#39;re interested in looking more at the performance, you can find benchmarks here: <a href="https://github.com/go-json-experiment/jsonbench">https://github.com/go-json-experiment/jsonbench</a>.</p><h3>Telemetry with Go</h3><h4>What is it?</h4><p>Russ Cox, who at the time of writing this article leads the development of the Go programming language at Google, gave a talk about changes in Go, and also told us about their proposal to introduce Transparent Telemetry, which started from a <a href="https://github.com/golang/go/discussions/58409">discussion</a> on GitHub.</p><p>To put it simply, it would allow some Go tools, like the compiler, gopls, and govulncheck, to report some usage data back to Google for analytics, in order to provide better, more effective features and fixes to the community.</p><h4>Why do we need it?</h4><p>The main reason, obviously, is to make the Go community happier. Telemetry would provide different information than user surveys.</p><p>The more analytics data the Go team collects, the more accurately they can assess whether a feature they released is used as expected or not.</p><p>Russ explained how Go&#39;s new features come from a data-driven process:</p><ul><li>Proposals listed on GitHub, which everyone can discuss</li><li>Talking to users: the Go user survey, in-editor surveys, research interviews/user experience studies</li><li>Analysis of randomly sampled open source code, where the larger the sample, the better</li></ul><p>But those were not enough. 
Indeed, most of the users report only when they think something is broken, but sometimes what is actually broken may not seem like a bug for users.</p><p>An example shared was when in 2020, the 1.14 version of Go contained changes for the macOS distribution ended up unwillingly requiring the users to install the Xcode Developer Tools, when actually this was not expected.</p><p>This was not found until late 2022, so two years later, when the Go team found out about it when investigating some stuffs. Users may have thought that this was the expected behavior for MacOS while it actually wasn&#39;t and this shows how bug reporting feature is not always effective.</p><p>So the idea was to put in place a transparent telemetry through Go tools, and by transparent, it means sharing how the data will be collected, what kind of data, the frequency etc.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*67SEVsYP7UC-P8gEWvVoBw.png" /><figcaption><a href="https://research.swtch.com/telemetry-design">https://research.swtch.com/telemetry-design</a></figcaption></figure><p>If you&#39;re interested about the proposed design, you can find more <a href="https://research.swtch.com/telemetry-design">there</a>.</p><h4>Optional feature instead of by default..</h4><p>They initially were planning to enable it by default but that made many people worries about it and ended up going with an opt-out option ( not by default ), meaning that users will actually need to be educated about why this is in their benefits to actually enable it, which is very unfortunate as this doesn&#39;t always guarantee that the Go team will have enough accurate collected data from Go users to validate new proposals efficiency.</p><p>Looking forward for the actual proposed implementation and its roadmap.</p><p>You can read more about the details regarding the opt-out decision <a href="https://research.swtch.com/telemetry-opt-in">there</a>.</p><p>If you go to San Diego and you&#39;re interested 
in trying good and affordable Mexican food, <a href="https://maps.app.goo.gl/ovmA4eXHJeTRAwQa6">Lola 55</a> in the East Village was lovely.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*PMkJrn49s8f8KTOyNrcO6g.jpeg" /></figure><hr><p><a href="https://medium.com/eureka-engineering/gophercon-san-diego2023-db57685ea61f">GopherCon San Diego 2023</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[From an iOS engineer to a Backend Engineer]]></title>
            <link>https://medium.com/eureka-engineering/from-an-ios-engineer-to-a-backend-engineer-bc52d2c79af1?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/bc52d2c79af1</guid>
            <category><![CDATA[backend-engineer]]></category>
            <category><![CDATA[ios-engineer]]></category>
            <category><![CDATA[golang]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[job-transition]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Tue, 07 Dec 2021 05:20:39 GMT</pubDate>
            <atom:updated>2021-12-07T05:21:51.035Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*9lhpGxC6jLmVbuPpjopFEQ.png" /></figure><h3>Transition from an iOS engineer to a backend engineer</h3><p>This is the December 7 article for <a href="https://qiita.com/advent-calendar/2021/eureka">Eureka’s 2021 Advent Calendar</a>.<br>Nearly one year ago, I transitioned from an iOS engineer position to a backend engineer position within the same company, Eureka.</p><p><strong>Hopefully, I can provide you with some useful insights into the transition from an iOS role to a backend engineer role by sharing how I proceeded and the different steps I took to catch up on the technical skills.</strong></p><h3>The transition from iOS engineer to backend engineer</h3><p>I joined <a href="https://eure.jp/"><strong>Eureka</strong></a> in June 2018 as an iOS engineer. It’s a company developing the dating application <a href="https://itunes.apple.com/jp/app/id583376064?mt=8"><strong>Pairs</strong></a>, which helps people find their life partners, and <a href="https://engage.pairs.lv"><strong>Pairs Engage</strong></a>, a digital and affordable marriage matchmaking agency helping people marry as soon as possible.</p><p><strong>Starting with iOS</strong></p><p>As an iOS engineer working on the <a href="https://engage.pairs.lv"><strong>Pairs Engage</strong></a> product, my contributions and learning were very diverse. 
Since I joined the project just as it started, I worked with a team of engineers, designers, and a product owner to design and implement new features: from mapping the user journey and user stories and contributing UX and UI ideas, to developing and shipping the iOS application for new targets/markets.</p><p>On the iOS side, it was an honor to work with my friend and teammate <a href="https://medium.com/u/834daea6eec5">John Estropia</a>, who led the iOS development and from whom I learned a lot.</p><p>I got the opportunity to work on very interesting tech stacks, from:</p><ul><li>An iOS project architecture that is <strong>MVVM/Flux</strong> reactive-based, to provide a single source of truth data-wise</li><li>Making delightful experiences through very smooth graphics thanks to libraries such as <strong>Texture/AsyncDisplayKit</strong></li><li>Implementing face detection to help center the user’s uploaded profile picture, making their face easier to notice</li><li>Many more..</li></ul><p>We sometimes held study sessions and pair programming for knowledge sharing, which resulted in the creation of multiple open source animation challenges available on <a href="https://github.com/Aymenworks/AnimationsChallenge">GitHub</a>.</p><p>If you’re interested in learning more about those animation challenges, you can read the first one right there, about the <a href="https://medium.com/eureka-engineering/animations-challenges-1-bear-ios-search-animation-7ea5e4ea0a34">Bear iOS search animation</a>.</p><h4>Why change to backend?</h4><p>I think at some point I wanted to widen my skillset (horizontally). When we create iOS apps, we interact a lot with backend engineers: how data is structured and stored, how business logic is handled, how performance for fetching and manipulating data is managed, etc. So I got interested in this team because I thought that, as someone with iOS experience, I might know what may or may not be convenient for frontend engineers. 
And learning how things are done behind the scenes on the API side is simply interesting.</p><h3>What’s similar and what’s different</h3><p>When I started working on API tasks, little by little I started seeing similarities, and sometimes big differences, both in the way of thinking and in the programming languages themselves.</p><h4>Programming language differences</h4><p>Both Swift and Go are <strong>statically typed</strong>, meaning that the type of every property is known at compile time, whether inferred or set explicitly, which is good for coding safely 😇.</p><pre>// go <br>s := &quot;Hello&quot; // type of `s` is `string`</pre><pre>----</pre><pre>// swift<br>let s = &quot;Hello&quot; // type of `s` is `String`</pre><p>It’s difficult to compare the two languages because they were made for different purposes. Go was designed to scale across multiple processors, which explains <strong>its native support for concurrency</strong> through a single keyword, go yourFunction(), <strong>the ability to communicate between goroutines</strong> using <strong>channels</strong>, native <strong>support for web servers</strong>, etc.</p><p>In terms of language features, in my opinion Swift wins, with features such as:</p><ul><li><strong>Optional types</strong>, Swift’s way of helping you avoid nil pointer errors as much as possible.</li></ul><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/1ef454dd59fc023c7771b6df848f8ef0/href">https://medium.com/media/1ef454dd59fc023c7771b6df848f8ef0/href</a></iframe><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/af1a4f87b948f57bf28ce3f6a40d748d/href">https://medium.com/media/af1a4f87b948f57bf28ce3f6a40d748d/href</a></iframe><p>Another example is when you have an embedded type with nullable properties.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a 
href="https://medium.com/media/fb0cc6e9d6556741071b7024aada9fff/href">https://medium.com/media/fb0cc6e9d6556741071b7024aada9fff/href</a></iframe><p>Also, in addition to Optional types:</p><ul><li>Inheritance</li><li>Generics ( <strong>coming soon to Go!</strong> )</li><li>The possibility to add custom operators for any type</li><li>Enum cases, with the possibility to check at compile time that your switch statement is not missing any possible value of your enumeration</li><li>Many more..</li></ul><h4>Different challenges</h4><p>Both platforms have their own, very different challenges.</p><p><strong>iOS:</strong></p><p><strong>Device and iOS compatibility</strong>: supporting old iOS versions to keep as many users as possible, updating the app to resolve issues that may happen with a new iOS version, fixing layout issues when a new device with a very small or big screen is released, deciding when to stop supporting a specific iOS version, etc. All those things take time and work.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vB4mktNd_Utus9HJz7jOPw.jpeg" /></figure><p><strong>App Store rejection:</strong> Releasing is not just about pushing a button; it’s about hoping that the released version will not be rejected on the App Store, and that it’s not going to be delayed by days before it’s approved.</p><p><strong>High expectations of UX/UI and graphics:</strong> In order to fulfill users’ needs, a significant amount of work goes into the experience of an application, its usability, and how easy and smooth it is to do something. This is one of the important challenges that frontend engineers in general need to deal with.</p><p>And many more..</p><p><strong>Backend:</strong></p><p><strong>Scalability and resilience</strong>. 
Compared to an iOS app, where there is usually only one user logged in, servers, depending on the company, have to support a load of anywhere from thousands to millions of users connected and using the API services at the same time.</p><p>So I would say that, in general, if you can delegate heavy work from your server to someone else, it will benefit your server’s CPU/memory consumption.</p><p>For example, suppose the server uses Amazon Web Services (AWS) S3 to upload images into buckets, and it’s common for clients to upload heavy pictures. Instead of asking the iOS client to send a heavy base64-encoded picture or a multipart request, in which case both the client and the server have to deal with a considerable amount of resource/memory usage, AWS S3 offers a nice feature called <a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html"><strong>Pre-signed URL</strong></a> that allows the client to upload the image directly to a specific URL, thus delegating the work from the server to AWS.</p><p>Another example: if one of your databases is down, having a way to quickly resolve the situation is important. It’s common to have database replication (multiple “slave” databases) that all hold a copy of the main (master) database and are always kept synchronized (with a small delay). Every time a request to fetch data is executed, instead of asking the master database, one of the slave databases is used instead, thus dividing the workload.</p><p><strong>Infrastructure costs. </strong>Since tons of users may be logged in and using the API services, choosing the most optimized solution for an implementation is important, as cost/resource consumption increases proportionally with the number of users using the service. 
So if you’re implementing, for example, a login frequency feature, make sure to compare the pros/cons of different services in terms of cost and server resource consumption depending on your needs, e.g. whether the data has a lifespan, etc. (MySQL? Amazon DynamoDB? Something else?)</p><p>And many more..</p><h4>How the transition from an iOS engineer to a backend engineer happened within my company, Eureka</h4><ol><li><strong>Sharing with my manager</strong></li></ol><p>I took the first step by sharing what I wanted to do with my manager; I wanted to have his opinion. A few months later, after meeting with the manager of the backend team, we agreed on the start date.</p><p>When I started, I exchanged closely with the CTO <a href="https://medium.com/u/dfdec54860fb">kaneshin</a>, and from there a friend and lead backend engineer, <a href="https://medium.com/u/5515517abe5">James Kirk</a>, took me under his wing and has mentored me all the way until now.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-QoZ8UjL8_6Xwr6oe6dJ5Q.png" /><figcaption><a href="https://roadmap.sh/backend">https://roadmap.sh/backend</a></figcaption></figure><p>We took this graph, which summarizes some of the important skillsets to have as a backend engineer, into account, along with the existing tech stacks of our Pairs Engage API project, and we started working on a side project first while being mentored.</p><p><strong>2. Started working on a side project first while being mentored</strong></p><p>In order to catch up on the minimum required skills and start being productive as soon as possible, working on projects/small tasks and learning fast was necessary.</p><p><a href="https://medium.com/u/5515517abe5">James Kirk</a> invited me to a weekly 1-2h pair programming meeting that we kept up for most of the year ( lucky me! 
)</p><p><strong>a) We first looked at the basics, or base setup, of a server app</strong>, meaning the things that need to be understood and implemented as soon as someone starts a new project.</p><p>In a different, new project made specially for experimenting and learning, after having done an <a href="https://go.dev/tour/welcome/">interactive introduction to the Go language</a>, we covered some basic things like:</p><ul><li><strong>How environment variables are used within the project for local development</strong>: with environment variables, your project configuration, including sensitive data (API keys, tokens, etc.), is not stored in your repo but locally in an env/envrc file</li><li><strong>Go mod</strong>, or how to manage dependencies within a Go project</li><li><strong>The life of an HTTP request</strong> in a Go server, where we took a look at handlers, middleware, etc.</li><li><strong>Panic handling</strong>, or how to recover when something bad happens in a request that could crash the server, such as an unhandled fatal error.</li><li>How to implement a <strong>graceful shutdown</strong> through Go’s signal and channel mechanisms, in order to safely and properly shut down resources/tasks/connections when the server needs to be turned off.</li><li>And many more..</li></ul><p><strong>b) We then moved to the next step, which covered:</strong></p><ul><li><strong>Clean Architecture</strong>, whose ideas/patterns are used within the project. Hands-on implementation of a constraint-based architecture where every layer should be as independent and testable as possible, which would allow us, for example, to <strong>easily replace any framework we are using, replace any database, etc. (see more about it below)</strong></li><li><strong>The use of Docker and </strong><a href="https://github.com/localstack/localstack"><strong>Localstack</strong></a>. 
Playing with AWS services freely can sometimes be dangerous cost-wise 😅, and since not all of the services have an easy way to run locally on our computers, <a href="https://github.com/localstack/localstack">Localstack</a> was used in the side project in order to be able to interact with different AWS services such as AWS S3, DynamoDB, etc.</li></ul><p><a href="https://github.com/localstack/localstack">GitHub - localstack/localstack: 💻 A fully functional local AWS cloud stack. Develop and test your cloud &amp; Serverless apps offline</a></p><ul><li>Learning on my side through Udemy classes..</li></ul><p><strong>c) From there, I worked on small tasks that included bug fixes, code improvements, etc., in order to get familiar with the project’s code.</strong></p><p>As the tasks got bigger and bigger, and I got more and more familiar with the project’s code thanks to my team’s PR reviews, feedback, and coaching, I started taking charge of bigger and bigger features that were rolled out in production to our users. Again, those tasks were really diverse. 
Those tasks included, for example:</p><ul><li><strong>Reducing the load on our server</strong> for multiple image uploads by using <strong>pre-signed AWS S3 URLs</strong>, which delegated the role of uploading the image from the server to the client.</li><li><strong>Designing a DynamoDB table</strong> to implement a login frequency feature (last 10 days)</li><li><strong>Resolving N+1 problems</strong></li><li>etc..</li></ul><p><strong>Hopefully, these insights into how I proceeded with the transition and the different steps I took to catch up on the skills were useful to you.</strong></p><p>So, don’t hesitate to go talk to your manager if you want to try, and try to find a mentor who can help guide you at the beginning; that would be very helpful!</p><p>I would like to thank my company, Eureka, for being very encouraging about this transition and for their support.</p><p>We’re hiring in Japan. You can find the list of jobs <a href="https://eure.jp/careers/">there</a>.</p><p><a href="https://eure.jp/careers/">Careers - Eureka, Inc.</a></p><p>If you’re interested, don’t hesitate to send me a private message on my Twitter account: <a href="https://twitter.com/aymenworks"><strong>aymenworks</strong></a></p><hr><p><a href="https://medium.com/eureka-engineering/from-an-ios-engineer-to-a-backend-engineer-bc52d2c79af1">From an iOS engineer to a Backend Engineer</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Animations challenges #3 — Zenly Pops Animation]]></title>
            <link>https://medium.com/eureka-engineering/animations-challenges-3-zenly-pops-animation-5810c7ea23a9?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/5810c7ea23a9</guid>
            <category><![CDATA[zenly]]></category>
            <category><![CDATA[ux]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[ux-design]]></category>
            <category><![CDATA[animation]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Thu, 03 Dec 2020 02:24:02 GMT</pubDate>
            <atom:updated>2020-12-04T07:22:14.558Z</atom:updated>
<content:encoded><![CDATA[<h3>Animations challenges #3 — Zenly Pops iOS Animation</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vm5cpw4h5kn8nQcBgNKy_Q.png" /></figure><blockquote>Animations challenges are actually pretty fun. They are about picking a random animation from one of the apps that we use every day (Twitter, Facebook, Slack, others..) and trying to recreate it together by doing pair programming.</blockquote><p>December 12 was the first day for <a href="http://twitter.com/aymenworks">me</a> and <a href="http://twitter.com/johnestropia">John</a>, both iOS engineers at <a href="https://eure.jp">Eureka</a>, of our Animations Challenges.</p><p>Animations are a lot of fun. When you look at them, they seem very simple, but when you look closer, you will notice that they involve many changes/sub-animations under the hood, making the end result amazing yet almost unnoticeable.</p><p>So the goal of these animation challenges is to analyze those animations and try to replicate them. After that, a blog article shows how we approached the challenge.</p><p><strong>For this animation, I had to work on it by myself for the Advent Calendar event. 
John will join me for another challenge, hopefully!</strong></p><p>This post is part of the Advent Calendar organized by Eureka, following <a href="https://medium.com/eureka-engineering/ios%E3%81%AB%E3%81%8A%E3%81%91%E3%82%8B%E3%83%84%E3%83%BC%E3%83%AB%E3%83%81%E3%83%83%E3%83%97%E3%81%AE%E5%AE%9F%E8%A3%85-ce1426f3fcc2">the second-day article, written by Shima in Japanese, about the implementation of tooltips on iOS.</a></p><h3>Introduction</h3><p>The third animation I decided to reproduce was the <a href="https://community.zen.ly/hc/en-us/articles/360005119977-Pops-">Zenly Pops animation</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/240/1*z-vwl09NmnDLemQNcdEfFg.gif" /><figcaption><a href="https://community.zen.ly/hc/en-us/articles/360005119977-Pops-">https://community.zen.ly/hc/en-us/articles/360005119977-Pops-</a></figcaption></figure><h3>Let’s dive in</h3><h4>Initial state:</h4><p>I created a quick design to start from. It has a label, a background, and a white view.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/282/1*EWseu8vZlfjefkxulZMjlg.png" /></figure><h4>Observation #1: Faces are falling from the sky!</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/240/1*XzOkUljW1rtBqH0vZ3ULig.gif" /><figcaption>Faces are falling from the sky</figcaption></figure><p>One of the first observations we can make is that faces are falling from the top to the bottom.</p><ul><li><strong>They fall one by one</strong></li><li><strong>They fall from different x positions</strong></li><li><strong>The way they fall down feels very familiar: they fall faster over time. 
Similar to taking an orange and letting it fall from a table.</strong></li></ul><p>While creating such<strong> “falling” </strong>animations may be possible the traditional way, with UIKit and custom animation curves, they can easily be made with UIKit Dynamics.</p><p>UIKit Dynamics has a series of classes we can use to communicate to a user how something feels by reproducing real-life physics.</p><p>We can divide UIKit Dynamics into two parts:</p><ul><li><strong>Behavior classes</strong>: classes for specific dynamic behaviors such as gravity, collision, etc.</li><li><strong>UIDynamicAnimator</strong>: the class that allows us to manage the behaviors above.</li></ul><p>To reproduce the falling effect, the first behavior we will use is <strong>UIGravityBehavior: </strong>gravity is an invisible force of attraction that exists between two items. Earth’s gravity is what pulls us to the ground, and that’s why items fall down.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/b8d97d438fa3b3fd0bcc84f054de8e5d/href">https://medium.com/media/b8d97d438fa3b3fd0bcc84f054de8e5d/href</a></iframe><ul><li><strong>1:</strong> the UIDynamicAnimator, the class that allows us to manage our specific dynamic behaviors</li><li><strong>2: </strong>We delay the creation of the face views to display them “one by one”</li><li><strong>3: </strong>Faces will have a random size</li><li><strong>4:</strong> <strong>x</strong>: we position each face randomly on the x-axis so that faces fall from different positions; <strong>y</strong>: negative so that they fall from above</li><li><strong>5: </strong>We add a gravity behavior for each of our faces</li><li><strong>6:</strong> We ask our “manager” to add the gravity behavior, which automatically triggers the animation.</li></ul><p>Which produces this result:</p><figure><img alt="" 
src="https://cdn-images-1.medium.com/max/272/1*EFqjiNPxakiJDgcuAHadJQ.gif" /><figcaption>Gravity behavior and faces falling</figcaption></figure><h4>Observation #2: Collisions everywhere!</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/240/1*z-vwl09NmnDLemQNcdEfFg.gif" /><figcaption>Collisions</figcaption></figure><p>We can notice that there is not only one type of collision, but multiple ones:</p><ul><li>Between the faces and the bottom white view</li><li>Between the faces themselves</li><li>Between the faces and the borders of the screen.</li></ul><p>The behavior we will use for creating collision effects is <strong>UICollisionBehavior</strong>.</p><p>Similarly to UIGravityBehavior, we will create an instance of it, add a collision behavior for each of our faces, and then tell our animator to add it.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/dd4f006fd8fdde5b59258cb007aa3932/href">https://medium.com/media/dd4f006fd8fdde5b59258cb007aa3932/href</a></iframe><p>If you run the code…</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/272/1*EFqjiNPxakiJDgcuAHadJQ.gif" /></figure><p>…nothing will happen 😅</p><p>The reason is that although we gave our face views a collision behavior, we didn’t say anything about the rest: the white bottom view, the left edge, or the right edge. Our face views are not aware of any “walls” or “barriers”, so we need to communicate that.</p><p>One way of doing that is by manually adding an invisible <strong>“boundary” </strong>that will act as an<strong> invisible wall</strong>. 
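</p><p>Since the embedded snippet may not render in RSS readers, here is a minimal sketch of the idea ( my own reconstruction; the names <code>whiteBottomView</code> and <code>bottomWall</code> are placeholders, not the author’s actual code ):</p>

```swift
import UIKit

final class PopsViewController: UIViewController {
    // The animator "manager" that drives all dynamic behaviors.
    private lazy var animator = UIDynamicAnimator(referenceView: self.view)
    private let collisionBehavior = UICollisionBehavior()
    // Placeholder for the white view at the bottom of the screen.
    private let whiteBottomView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Turn the top edge of the white bottom view into an invisible wall
        // that the falling face views will collide with.
        collisionBehavior.addBoundary(
            withIdentifier: "bottomWall" as NSString,
            from: CGPoint(x: whiteBottomView.frame.minX, y: whiteBottomView.frame.minY),
            to: CGPoint(x: whiteBottomView.frame.maxX, y: whiteBottomView.frame.minY)
        )
        animator.addBehavior(collisionBehavior)
    }
}
```

<p>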
We do that by calling the <strong>addBoundary</strong> method on the collision behavior, passing the path of our white bottom view.</p><p>Since we are now using the collisionBehavior property inside a different method, we need to move its declaration out of viewDidLoad.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/ea649e96232142164891dd6253d9c622/href">https://medium.com/media/ea649e96232142164891dd6253d9c622/href</a></iframe><p>And the result is:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/285/1*IqsMkgAx5MB9UvYgvY_xpw.gif" /></figure><p>The white bottom view is now acting like a wall ✅<br>But wait… Why are our faces escaping through the left and right edges now?</p><p>Well, similarly to what we did for the white bottom view, we need to create invisible walls for the left and right edges so that our faces stay within the area.</p><p>Since the edge positions are quite straightforward, we can add them directly in viewDidLoad:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/87593cc89369c8964d17e8adeec8231d/href">https://medium.com/media/87593cc89369c8964d17e8adeec8231d/href</a></iframe><p>If we run, the result is:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/286/1*nvznKXq5noHxNYVgilFuEw.gif" /></figure><p>Well… It’s quite buggy 😅 🙄</p><p>My guess is that it’s because we didn’t set physics-related properties in the code, such as the friction between the views. We will see that in the next section.</p><h4>Observation #3: Polishing physics</h4><p>Let’s take a look again at the original animation:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/240/1*z-vwl09NmnDLemQNcdEfFg.gif" /></figure><ul><li>One thing we can notice is that the face views don’t have much <strong>elasticity</strong>, if any at all. They’re not as bouncy as a ping-pong ball or a basketball.</li><li>Another thing we can notice is the <strong>friction</strong>. Friction is a force that opposes sliding motion when two objects rub against each other. When you’re skiing, your skis and the snow are in friction. You still move fast on top of the snow because the friction is low, which lets the motion happen quickly. So another observation in this Zenly animation is that there doesn’t seem to be much friction between the face views. Face views almost slide when they rub against each other.</li></ul><p>So let’s try to apply those observations using a new behavior class called <strong>UIDynamicItemBehavior</strong>. This class allows us to customize real-life physics properties such as elasticity, friction, etc.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/1365ffba22496491030efd85dd93700b/href">https://medium.com/media/1365ffba22496491030efd85dd93700b/href</a></iframe><p>And the result is:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/400/1*b1qUjJrLD_LYE_hgJMb8qA.gif" /></figure><p>It’s better! 🤓</p><h4>Observation #4: Pan gesture to move items and create chaos!</h4><p>If you wait until the end of the gif, you will notice that items move and push each other near the end:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/240/1*z-vwl09NmnDLemQNcdEfFg.gif" /><figcaption>Collisions</figcaption></figure><p>The way I approached this behavior is by adding a pan gesture that pushes the face view in the direction of the pan!</p><p>As I pan over a face view with my finger, I use a dynamic behavior to translate/push the face view following the velocity of the gesture.</p><p>To do that,</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/4c218d8759f3d3959a61ae616d79d2f4/href">https://medium.com/media/4c218d8759f3d3959a61ae616d79d2f4/href</a></iframe><p>and the result as we apply pan movements:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/400/1*5917KaD2zk_m1lI4_fjg9w.gif" /></figure><h4>✅ Good job for reading and reaching the end 💪</h4><h4>You can now reproduce an animation “similar” to the Zenly Pops animation!</h4><p>You learned that sometimes, something that looks like a challenge can, once you put enough thought into it, be achieved without struggling too much.</p><p>This is the third animation challenge article I’ve written, and there are going to be more! 
So if you have any feedback, let me know :).</p><p>You can find the full animation project here:</p><p><a href="https://github.com/Aymenworks/AnimationsChallenge/tree/master/ScrollToSearch">Aymenworks/AnimationsChallenge</a></p><p>If you have any questions, you can reach <a href="http://twitter.com/aymenworks">me</a> on Twitter anytime :).</p><p>You can already find the first and second animation challenges right here:</p><ul><li><a href="https://medium.com/eureka-engineering/animations-challenges-1-bear-ios-search-animation-7ea5e4ea0a34">Animations challenges #1 — Bear iOS Search animation</a></li><li><a href="https://medium.com/eureka-engineering/animations-challenges-2-asana-loader-animation-c3a6d040f358">Animations challenges #2 — Asana Loader animation</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=5810c7ea23a9" width="1" height="1" alt=""><hr><p><a href="https://medium.com/eureka-engineering/animations-challenges-3-zenly-pops-animation-5810c7ea23a9">Animations challenges #3 — Zenly Pops Animation</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Psychology in Design]]></title>
            <link>https://medium.com/eureka-engineering/psychology-in-design-fcd6c8ce7a0b?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/fcd6c8ce7a0b</guid>
            <category><![CDATA[laws-of-ux]]></category>
            <category><![CDATA[ux]]></category>
            <category><![CDATA[design]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Mon, 09 Dec 2019 00:27:29 GMT</pubDate>
            <atom:updated>2019-12-09T00:31:13.484Z</atom:updated>
            <content:encoded><![CDATA[<p>This article is for the 9th day of the Eureka Advent Calendar.</p><h3>Who am I?</h3><p>I am Aymen, from Marseille, France, working in Tokyo, Japan.</p><p>I joined <a href="https://eure.jp/"><strong>Eureka</strong></a> in June 2018 as an iOS Engineer. It’s a company developing the dating application <a href="https://itunes.apple.com/jp/app/id583376064?mt=8"><strong>Pairs</strong></a>, which helps people find their life partners 🌸, and <a href="https://engage.pairs.lv"><strong>Pairs Engage</strong></a>, a digital and affordable marriage matchmaking agency helping people marry as soon as possible.</p><h3>Why I am talking about this topic</h3><p>One of my recent goals was to help my team not only with my iOS development skills, but also with UI/UX where possible.</p><p>So I thought about an interesting issue we found in our product that I could try to resolve: some users think that Pairs Engage is like an online dating app ( but it’s not; it’s a digital marriage matchmaking service ).</p><p>So I researched how to change people’s image of a product and make users feel that this is a digital marriage matchmaking service and not just an online dating service.</p><p>While doing that research, I found an interesting website called Laws of UX.</p><h3>Laws of UX</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/274/1*N-F0X_FoYq4LMTj3_va4LA.png" /><figcaption><a href="http://lawsofux.com"><strong>http://lawsofux.com</strong></a></figcaption></figure><p>Jon Yablonski created a list of what he calls UX Laws about user experience, and he believes that by knowing them, you will be able to create more intuitive designs.</p><p><strong>Why?</strong> Because those rules are based on a general scientific understanding of how the mind works, and so they focus on the psychology side of design.</p><p>So I want to dive into some of 
those UX Laws, understand them, and see how they play out with both physical and digital products.</p><h4>The Aesthetic-Usability Effect</h4><p>Did you ever wonder why cheap wine tastes better in fancy glasses? I did. And that’s kind of what this is about.</p><p>Researchers Masaaki Kurosu and Kaori Kashimura from the Hitachi Design Center tested 26 variations of an ATM UI and found out that users are strongly influenced by the look and feel of interfaces.</p><p>Basically, it means that a nice design can make users more tolerant of minor usability issues and prevent issues from being discovered during usability testing.</p><p>To get a better understanding of this, let’s take a look at the screenshot below:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*SwD4LDWa4RnuMtY8-es_UA.png" /><figcaption>Source: Pinterest images</figcaption></figure><p>It means that most users preferred the screen on the left over the one on the right because of its “better UI”, its look and feel.</p><p>When looking more closely, we can see that the screen on the left actually has some issues, such as not letting you correct, erase, or manually validate the PIN code the way today’s ATM UIs do ( right screen ).</p><h4>The Peak-End Rule</h4><p>The Peak-End Rule says that we don’t remember experiences accurately.<br>Rather, we tend to recall the highlights and how the experience ended ( a good or bad end ).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*MrwTaDrAiI4DLUgBc1Au1w.png" /><figcaption>source: nngroup.com</figcaption></figure><p>A good example is people waiting in a queue for an attraction:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*txWdQq06eh-6giMJXMUKzA.png" /><figcaption>Source: <a 
href="https://www.theladders.com/career-advice/peak-end-rule-why-you-make-terrible-life-choices">https://www.theladders.com/career-advice/peak-end-rule-why-you-make-terrible-life-choices</a></figcaption></figure><p>Amusement parks like Disneyland have tons and tons of amazing attractions, and just as many long queues where people wait to ride them. After the wait, here we go, it’s time for the fun. It’s usually a <strong>short</strong> time of fun, excitement, and pleasure in comparison to the waiting queue.</p><p>When asked “how was the attraction?”, many people will omit the annoying waiting queue and just talk about how the attraction itself was, good or bad.</p><p>That shows that how an experience ends is important, and that it’s what some people remember the most from an experience.</p><p><strong>How does it apply to digital products?</strong></p><p>Many things can annoy users when using digital products: a survey, reading guidelines, a long and seemingly interminable registration form, etc.</p><p>Can having a gift or surprise at the end, as a “peak experience”, make the user forget about the boring but necessary survey or form? It can be a coupon, points, or whatever makes the user feel like it was not all for nothing.</p><h4>Hick’s Law</h4><blockquote>The time it takes to make a decision increases with the number and complexity of choices. 
Users bombarded with choices have to take time to interpret and decide, giving them work they don’t want.</blockquote><figure><img alt="" src="https://cdn-images-1.medium.com/max/468/1*i1AS4ufXt0_Z4hqn3v8WiQ.png" /><figcaption>Source: <a href="https://www.dailymail.co.uk/sciencetech/article-508949/Gadgets-Why-complicated-electronic-devices-driving-mad.html">https://www.dailymail.co.uk/sciencetech/article-508949/Gadgets-Why-complicated-electronic-devices-driving-mad.html</a></figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/768/1*GBIdulEif9WkEO28kKVGTQ.png" /><figcaption>Netflix.com</figcaption></figure><p><strong>What do these two images have in common?</strong></p><ul><li>They both have many options</li><li>Some of them can be complex</li></ul><p>In both examples, this can make the user take a long time to act or, in the case of Netflix, simply give up on watching anything.</p><p><strong>How to reduce the complexity of choices?</strong></p><ol><li>One way is to focus on what is important</li></ol><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/1*nHql812n4YMRvJGeQyXIRQ.png" /><figcaption>Apple Remote</figcaption></figure><p>This example shows how focusing on what is important can help reduce complexity.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*OB6esKs1blN_QCfNPWm2GA.png" /></figure><p>For this second example, imagine having to look for a specific country in a long, unsorted list of all countries, compared to a sorted, grouped-by-continent list. 
Grouping related information can help reduce the complexity of finding things and completing an action.</p><h4>Miller’s Law</h4><blockquote>The average person can only keep 7 (plus or minus 2) items in their working memory.</blockquote><figure><img alt="" src="https://cdn-images-1.medium.com/max/848/1*bosCGMfBqrN9WwyKXaVTEQ.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/976/1*ANYvvpGM2SdstwDBQlktzA.png" /></figure><p><strong>Which phone number is easier to remember? There is a big chance that the second one is, and here is why:</strong></p><p>Every time your brain has to remember information, whether it’s a phone number, how to navigate through a website, etc., there is mental effort involved.</p><p>The problem is that the space in your brain in which this information is processed and stored is limited. Your brain begins to slow down or even abandon the task at hand when it receives more information than it can handle.</p><p>This overload can be due to too many choices, too much thought required, or a lack of clarity.</p><p>Regarding the phone number, <strong>chunking </strong>is one way of reducing the overload/mental effort. 
It’s about grouping the long phone number into small groups of information that are easy to remember.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/1*du5flPEziCaTEBIu0ccKPg.png" /></figure><h3>Conclusion</h3><p><a href="https://lawsofux.com">https://lawsofux.com</a> may be a great resource for improving your design process and also your arguments during design reviews, since it’s based not on your personal taste or thinking but on scientific studies that can be used to demonstrate and add more value to your arguments.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=fcd6c8ce7a0b" width="1" height="1" alt=""><hr><p><a href="https://medium.com/eureka-engineering/psychology-in-design-fcd6c8ce7a0b">Psychology in Design</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Animations challenges #2 — Asana Loader animation]]></title>
            <link>https://medium.com/eureka-engineering/animations-challenges-2-asana-loader-animation-c3a6d040f358?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/c3a6d040f358</guid>
            <category><![CDATA[swift]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[ios-development]]></category>
            <category><![CDATA[asana]]></category>
            <category><![CDATA[animation]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Wed, 30 Jan 2019 00:16:56 GMT</pubDate>
            <atom:updated>2019-05-09T10:14:35.939Z</atom:updated>
            <content:encoded><![CDATA[<h3>Animations challenges #2 — Asana Loader animation</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ga3kQIWkE54UwP_rXl5ipg.png" /></figure><blockquote>Animations challenges are actually pretty fun. It’s about picking a random animation from one of the apps we use every day ( Twitter, Facebook, Slack, and others ) and trying to recreate it together through pair programming.</blockquote><p>Animations are a lot of fun. At first glance they look very simple, but when you look closer, you notice they involve many changes and sub-animations under the hood, making the end result amazing yet almost unnoticeable.</p><p>So the goal of these animation challenges is to analyze those animations and replicate them. After that, there will be a blog article showing how we approached the challenge.</p><p><strong>If this is the first article you see from me</strong>, you can check out the article about the Bear iOS app search animation:</p><p><a href="https://medium.com/eureka-engineering/animations-challenges-1-bear-ios-search-animation-7ea5e4ea0a34">Animations challenges #1 — Bear iOS Search animation</a></p><p><strong>For this animation, I worked on it by myself, without </strong><a href="https://twitter.com/johnestropia"><strong>John</strong></a><strong>. This one is too easy for him 😚.</strong></p><h3>Introduction</h3><p>The second animation I decided to reproduce within one hour ( though in the end it took only a few minutes ) was the loading animation from the <a href="https://itunes.apple.com/app/id489969512">Asana iOS app</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*tqKQ5QfSrPDFaiB_yMmIOA.gif" /><figcaption>Asana iOS app</figcaption></figure><h3>Let’s dive in</h3><h4>Observation #1 — A gradient of two colors: Pink and beige</h4><p>First, we can see that there are probably two colors used in a horizontal gradient. 
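</p><p>In code, that boils down to a <code>CAGradientLayer</code> whose <code>startPoint</code> and <code>endPoint</code> lie on the horizontal axis. Here is a minimal sketch ( my own reconstruction with placeholder color values; the embedded snippets below contain the actual code ):</p>

```swift
import UIKit

final class LoaderView: UIView {
    // Placeholder colors approximating the pink and beige of the Asana loader.
    private let pink = UIColor(red: 0.96, green: 0.62, blue: 0.62, alpha: 1)
    private let beige = UIColor(red: 0.98, green: 0.92, blue: 0.84, alpha: 1)
    private let gradient = CAGradientLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        // A horizontal gradient: start at the left edge, end at the right edge.
        gradient.startPoint = CGPoint(x: 0, y: 0.5)
        gradient.endPoint = CGPoint(x: 1, y: 0.5)
        // Three stops (pink, beige, pink) so the beige band is visible in the middle.
        gradient.colors = [pink.cgColor, beige.cgColor, pink.cgColor]
        layer.addSublayer(gradient)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Keep the gradient covering the whole view.
        gradient.frame = bounds
    }
}
```

<p>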
How can we define this in code?</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/6628f6ed4c74f89e5f27694271d3850e/href">https://medium.com/media/6628f6ed4c74f89e5f27694271d3850e/href</a></iframe><ul><li>We define a <strong>pink</strong> and a <strong>beige</strong> color</li><li>We also create a gradient layer, aka a <strong>CAGradientLayer</strong>.</li><li>We create an empty setup function. We will get back to this function later. It will take care of setting up the gradient look we want.</li></ul><p><strong>CAGradientLayer </strong>is a subclass of <strong>CALayer</strong> and allows you to create a color gradient with as many colors as you want. By default, it spreads colors uniformly 😮. But wait! That’s not a problem. There are many customizable properties that allow you to create the gradient of your dreams. Among the properties we will use are:</p><ul><li><strong>startPoint</strong> and <strong>endPoint</strong> of the gradient. Basically, it’s where the gradient starts and where it finishes. For example, a starting point of x=0.5,y=0 and an end point of x=0.5,y=1 means that the gradient will go from top to bottom. 
Another example: a starting point of x=0,y=0.5 and an end point of x=1,y=0.5 means that the gradient will go from left to right, etc.</li></ul><p>So let’s set up the <strong>startPoint </strong>and the<strong> endPoint </strong>so that the gradient goes from left to right:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/4e6ec86cf5aabcdc53a93fa84e73c3c3/href">https://medium.com/media/4e6ec86cf5aabcdc53a93fa84e73c3c3/href</a></iframe><p>PS: I set three colors on the gradient’s colors property, so that we can clearly see the gradient on the left and right sides of the beige color.</p><p>If we run the app, it will appear this way:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/700/1*slStrBnhR-QAy6LgVhob_w.png" /></figure><ul><li>The <strong>locations</strong>. It’s an array of values that indicates where the gradient’s colors stop. You can think of it as <strong>how much each color should fill the gradient. 
</strong>For example, if we want the beige color to be at the end of the gradient ( which is the second color in the array we gave to the <strong>gradient.colors property </strong>), we can do something like:</li></ul><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/7c4896b393867dd97ae8fde4ee6cf339/href">https://medium.com/media/7c4896b393867dd97ae8fde4ee6cf339/href</a></iframe><p>And the result will be:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/694/1*ZMNMRjU17imeWt73xFStIQ.png" /></figure><p>It’s because we say that the first color ( pink ) will stop at 70%, then the second color ( beige ) will start and finish at 90%, and then the rest will be filled by the third color ( pink ).</p><p>In our case, we suppose that the beige color starts at the beginning of the gradient and will be animated over time, so let’s set it up:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/ae93eb6f9ff0d81f41e8a800ce29954d/href">https://medium.com/media/ae93eb6f9ff0d81f41e8a800ce29954d/href</a></iframe><figure><img alt="" src="https://cdn-images-1.medium.com/max/690/1*aqWbGHk310iMmU4fekTXdQ.png" /><figcaption>Result of setting up the beige gradient at the beginning</figcaption></figure><h4>Observation #2 — The beige color position is changing over time</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*K47IG9Sf678w3EGexb_8ng.gif" /><figcaption>Beige color position changing over time</figcaption></figure><p>We can see that the beige color position is changing over time, so how can we do that?</p><p>Well, if you look at the <strong>locations</strong> property documentation, it says:</p><blockquote><strong>locations</strong>: An optional array of NSNumber objects defining the location of each gradient stop. <strong>Animatable</strong>. 
<a href="https://developer.apple.com/documentation/quartzcore/cagradientlayer/1462410-locations">https://developer.apple.com/documentation/quartzcore/cagradientlayer/1462410-locations</a></blockquote><p>Animatable?? Wow! That’s amazing! It literally means that we can just change the <strong>locations </strong>property<strong> </strong>in an animation and the work will be done!</p><p>So let’s make it work right now!</p><p>To achieve that, we will use the convenient <strong>CABasicAnimation</strong> class, which takes care of<strong> animating CALayer properties</strong> very easily by defining:</p><ul><li><strong>fromValue</strong>: the starting value before the animation. In our case it’ll be the start position of our gradient colors ( the beige color starting at the beginning of the gradient ).</li><li><strong>toValue</strong>: the end value we want to reach at the end of the animation. In our case it’ll be the end position of our gradient colors ( the beige color moving to the end of the gradient )</li><li>the property name</li><li>any options you want</li></ul><p><em>So, for this article, let’s start the animation when the view appears.</em></p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/961cf9e7a9c463269a969fc0d393d4ad/href">https://medium.com/media/961cf9e7a9c463269a969fc0d393d4ad/href</a></iframe><p>And create our<strong> startLoading()</strong> function:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/2c15be002913b76f49b718a066f283a5/href">https://medium.com/media/2c15be002913b76f49b718a066f283a5/href</a></iframe><ul><li>CABasicAnimation takes the property name (key path) as a parameter</li><li>You can specify many things, such as the <strong>duration</strong> of the animation or whether your animation should loop using <strong>repeatCount</strong></li><li>The from and to values</li><li>and more…</li></ul><p>Let’s run the app.</p><h3>Final 
result</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*K47IG9Sf678w3EGexb_8ng.gif" /><figcaption>Final result</figcaption></figure><h3>✅ Good job for reading and reaching the end 💪</h3><h4>You can now reproduce a loading animation “similar” to the one in the Asana iOS application</h4><p>You learned that sometimes, something that looks like a challenge can, once you put enough thought into it, be achieved without struggling too much.</p><p>This is the second animation challenge blog article I’ve written. If this is the first article you see from me, you can check out the <a href="https://link.medium.com/Qqn8poIfLT">article</a> about the Bear iOS app search animation.</p><p><strong>There is going to be more! If there are any animations you would like to see built, don’t hesitate to add a comment below :).</strong></p><p>You can find the full animation project here:</p><p><a href="https://github.com/Aymenworks/AnimationsChallenge/tree/master/AsanaLoadingView">Aymenworks/AnimationsChallenge</a></p><p>If you have any questions, you can reach <a href="http://twitter.com/aymenworks">me</a> or <a href="http://twitter.com/johnestropia">John</a> on Twitter anytime :).</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=c3a6d040f358" width="1" height="1" alt=""><hr><p><a href="https://medium.com/eureka-engineering/animations-challenges-2-asana-loader-animation-c3a6d040f358">Animations challenges #2 — Asana Loader animation</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Animations challenges #1 — Bear iOS Search animation]]></title>
            <link>https://medium.com/eureka-engineering/animations-challenges-1-bear-ios-search-animation-7ea5e4ea0a34?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/7ea5e4ea0a34</guid>
            <category><![CDATA[swift]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[animation]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Fri, 18 Jan 2019 06:40:17 GMT</pubDate>
            <atom:updated>2019-05-09T10:11:15.008Z</atom:updated>
            <content:encoded><![CDATA[<h3>Animations challenges #1 — Bear iOS Search animation</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vm5cpw4h5kn8nQcBgNKy_Q.png" /></figure><blockquote>Animations challenges are actually pretty fun. It’s about picking a random animation from one of the apps we use every day ( Twitter, Facebook, Slack, and others ) and trying to recreate it together through pair programming.</blockquote><p>December 12 was the first day for <a href="http://twitter.com/aymenworks">me</a> and <a href="http://twitter.com/johnestropia">John</a>, both iOS Engineers at <a href="https://eure.jp">Eureka</a>, of our Animations Challenges.</p><p>Animations are a lot of fun. At first glance they look very simple, but when you look closer, you notice they involve many changes and sub-animations under the hood, making the end result amazing yet almost unnoticeable.</p><p>So the goal of these animation challenges is to analyze those animations and replicate them. After that, there will be a blog article showing how we approached the challenge.</p><p>Before diving into our first challenge, I would like to explain how we proceed.</p><h3>Process and conditions</h3><p><strong>Duration</strong> = ~1h<br><strong>Process</strong>: Pair programming and the Pomodoro Technique</p><p>The <a href="https://en.wikipedia.org/wiki/Pomodoro_Technique">Pomodoro Technique</a> is about using a timer to break down work into intervals. 
We try to switch every 15 minutes so that each of us gets to work on it.</p><p>It has great advantages, such as letting both of us make technical decisions while the observer asks questions to make sure we are doing things right.</p><h3>Introduction</h3><p>The first animation we decided to reproduce within one hour was the search animation from the <a href="https://itunes.apple.com/us/app/bear/id1016366447?mt=8">Bear iOS app</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/534/1*xcn4R5aCXs89Fn9hqdUxVg.gif" /><figcaption><a href="https://itunes.apple.com/us/app/bear/id1016366447?mt=8">Bear iOS app</a></figcaption></figure><h3>Let’s dive in</h3><h4>Observation #1 — Three different states</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/162/1*5aWeU3KDSg2x5UmFacs7Og.png" /><figcaption>Three different states</figcaption></figure><p>First, we observed that it involves three different states:</p><ul><li>The<strong> search is empty</strong> = start or default state</li><li>The <strong>search button is full</strong> = full state</li><li>The <strong>search button is scaled</strong> = final state</li></ul><p>Let’s create a SearchView with properties that represent those three different states:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/a2d73d8b42525033277799ce2c7a6d87/href">https://medium.com/media/a2d73d8b42525033277799ce2c7a6d87/href</a></iframe><p><em>PS: We want the progress of the animation to be between 0 and 1.</em></p><h4>Observation #2 — Scroll down to reach the goal</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/534/1*xcn4R5aCXs89Fn9hqdUxVg.gif" /><figcaption><a href="https://itunes.apple.com/us/app/bear/id1016366447?mt=8">S</a>croll from top to bottom</figcaption></figure><p>Animations happen when we scroll from top to bottom, which means that we have to:</p><ul><li><strong>Decide when the goal (final state) is reached</strong>. 
For example, after scrolling about 50 points vertically, we can consider the animation finished</li><li><strong>Scrolling from top to bottom</strong> will generate a <strong>negative offset</strong>, so since we want to work with positive values, we convert it to a positive value</li><li>Since we are playing with a progress property, we want it to be <strong>between 0 and 1</strong> (0% and 100%)</li></ul><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/9e1a011e641d374bab5bd0cc8d59ab9b/href">https://medium.com/media/9e1a011e641d374bab5bd0cc8d59ab9b/href</a></iframe><p>And let’s add an empty search function for now:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/fc58b7bf82da3796e6a67d9bab71c740/href">https://medium.com/media/fc58b7bf82da3796e6a67d9bab71c740/href</a></iframe><h4>Observation #3 — Two search buttons?</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/102/1*FKoZiBEfccO4PFR5cyclLg.png" /><figcaption>Two different colors at the same time</figcaption></figure><p>When looking closer, we noticed that while scrolling, we can see two different stroke colors at the same time for the search icon.</p><p>So based on that, we supposed that:</p><ul><li>There are probably <strong>two buttons with the same size but different background and stroke colors</strong>. 
The foreground button has a dark yellow background color with a white stroke, while the background button has a clear background color with a gray stroke</li><li><strong>There is a mask</strong> (which can be a CAShapeLayer) that <strong>hides/shows</strong> a part of the foreground button.</li><li>As the user scrolls, we are going to play with the mask path to show and hide the foreground button depending on the progress of the animation</li></ul><p>So let’s fill our SearchView class now!</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/93a66b40be0bd1284bfe6d6aa75f6502/href">https://medium.com/media/93a66b40be0bd1284bfe6d6aa75f6502/href</a></iframe><p>And then let’s lay out everything:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/960aa0780883df412115d12f2f77973c/href">https://medium.com/media/960aa0780883df412115d12f2f77973c/href</a></iframe><p>Finally, let’s call the setup method when the view is initialized:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/0961dc0d366ec1b3bfd68f25d5098de6/href">https://medium.com/media/0961dc0d366ec1b3bfd68f25d5098de6/href</a></iframe><h4>Observation #4 — Time to start filling</h4><p>In <strong>Observation #2</strong>, we created an empty update function.<br>This function will take care of making this animation possible, and the first step is to reproduce that “fill” animation.</p><blockquote>There is a mask (which can be a CAShapeLayer) that hides/shows a part of the foreground button.</blockquote><p>The role of this mask layer is to show/hide the foreground button (the one with the dark yellow background color) based on the mask path; that’s how we can create this “fill” animation.</p><p>If the mask layer’s path is nil (the default value), the view it is attached to (the foreground button) will not be visible. 
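</p><p>As a rough sketch of the setup described in Observation #3 (the names like <em>foregroundButton</em> and <em>maskShapeLayer</em> are my own assumptions, not necessarily those of the original gists embedded above):</p>

```swift
import UIKit

// A sketch of the SearchView described above: two overlapping buttons,
// with a CAShapeLayer used as the foreground button's mask.
final class SearchView: UIView {

    private let backgroundButton = UIButton(type: .custom)
    private let foregroundButton = UIButton(type: .custom)
    let maskShapeLayer = CAShapeLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setup()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setup()
    }

    private func setup() {
        // Background button: clear background, gray stroke.
        backgroundButton.layer.borderColor = UIColor.gray.cgColor
        backgroundButton.layer.borderWidth = 1
        addSubview(backgroundButton)

        // Foreground button: dark yellow background, white stroke.
        foregroundButton.backgroundColor = UIColor(red: 0.85, green: 0.65, blue: 0.1, alpha: 1)
        foregroundButton.layer.borderColor = UIColor.white.cgColor
        foregroundButton.layer.borderWidth = 1
        addSubview(foregroundButton)

        // The mask's path is nil for now, so the foreground button is hidden.
        foregroundButton.layer.mask = maskShapeLayer
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Both buttons share the same frame; the mask decides what is visible.
        backgroundButton.frame = bounds
        foregroundButton.frame = bounds
    }
}
```

<p>With this sketch, the foreground button stays fully invisible until the mask gets a path. 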
This is the case right now.</p><p>So let’s try to change this mask layer’s path property based on the progress of the scroll and see what happens:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/ab7a8df9689a209e93eca74ff9fa997c/href">https://medium.com/media/ab7a8df9689a209e93eca74ff9fa997c/href</a></iframe><figure><img alt="" src="https://cdn-images-1.medium.com/max/96/1*dZ1eIkyrv4d0xi4w0SWzyg.gif" /><figcaption>Filling from top to bottom</figcaption></figure><p>Wow! It’s starting to look like something already! Let’s just tweak the starting point a little so that the animation runs from bottom to top as we want, instead of the opposite.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/051893c21d60e3f361554489dc386964/href">https://medium.com/media/051893c21d60e3f361554489dc386964/href</a></iframe><figure><img alt="" src="https://cdn-images-1.medium.com/max/102/1*O6mNk2wzjd4Z1LGbbwXDRg.gif" /><figcaption>Filling from bottom to top</figcaption></figure><p>We are almost there! Now we have to take care of the scaling!</p><h4>Observation #5 — Scaling to finish</h4><p>Now that we can reach the full state, we need to reach the final state, where the button is scaled.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/36b8e189c9d40d1448c5db1ce97aefb5/href">https://medium.com/media/36b8e189c9d40d1448c5db1ce97aefb5/href</a></iframe><p>Let’s define how big we want the button to grow/scale, where 1.0 is the original size.</p><p>But there is one thing: our update function is already using the progress property to change the mask’s path and fill the button completely when the progress is equal to 1.</p><p>But if we look at our animation states, the full state should be reached at 0.7, not 1.0. 
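</p><p>As a sketch of the bottom-to-top fill described above (assuming a hypothetical CAShapeLayer mask like the one from Observation #3; the real gist embedded above may differ):</p>

```swift
import UIKit

// Updates a shape layer's path so that it reveals its view from bottom to top.
// `progress` is expected to be between 0 and 1.
func updateFill(of maskLayer: CAShapeLayer, in bounds: CGRect, progress: CGFloat) {
    let clamped = min(max(progress, 0), 1)
    let height = bounds.height * clamped
    // A rect anchored at the bottom whose height grows with the progress.
    let visibleRect = CGRect(x: 0,
                             y: bounds.height - height,
                             width: bounds.width,
                             height: height)
    maskLayer.path = UIBezierPath(rect: visibleRect).cgPath
}
```

<p>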
And from 0.7 to 1.0, it should play the scale animation.</p><p>So we have to tweak the progress property a little by remapping the interval 0…0.7 onto 0…1.</p><p>How can we map a range of 0…0.7 onto 0…1?</p><p>Let’s apply some math:</p><ul><li>(progress - newStartInterval) / (newEndInterval - newStartInterval)</li><li>(progress - 0) / (0.7 - 0)</li></ul><p>The new update function becomes:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/38bf07e6b35ee1c77c2a91e5433b7b3f/href">https://medium.com/media/38bf07e6b35ee1c77c2a91e5433b7b3f/href</a></iframe><p>The last and final step is to add the scaling. And there is another range remapping going on: this time we want to remap the interval 0.7…1 onto 0…1.</p><p>Let’s apply some math again:</p><ul><li>(progress - newStartInterval) / (newEndInterval - newStartInterval)</li><li>(progress - 0.7) / (1 - 0.7)</li></ul><p>So let’s add, at the end of the update function, the new shifted progress used for scaling the button:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/e32ccc346278db90983bcb41854bb6b3/href">https://medium.com/media/e32ccc346278db90983bcb41854bb6b3/href</a></iframe><p>So in the code above, if the progress is bigger than 0.7, we start scaling based on the new shifted progress; otherwise, we remove the scaling effect.</p><h3>Final Result</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/154/1*gw5kubgY7JH55oJ5US371g.gif" /><figcaption>Final animation (square)</figcaption></figure><p>Oh! 
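</p><p>By the way, the two range remappings above can be condensed into one small helper (a sketch; the real gists embedded above may structure this differently):</p>

```swift
import CoreGraphics

// Remaps `progress` from the interval start…end onto 0…1, clamped.
// This is the "(progress - newStartInterval) / (newEndInterval - newStartInterval)" math above.
func shiftedProgress(_ progress: CGFloat, from start: CGFloat, to end: CGFloat) -> CGFloat {
    let value = (progress - start) / (end - start)
    return min(max(value, 0), 1)
}

// The fill plays over the first 70% of the scroll…
let fillProgress = shiftedProgress(0.35, from: 0, to: 0.7)   // 0.5: half filled
// …and the scaling over the last 30%.
let scaleProgress = shiftedProgress(0.85, from: 0.7, to: 1)  // 0.5: halfway scaled
```

<p>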
And if you want the rounded animation like the original, you can make your button a circle:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/12ddb6b828c36e934de9881be621c559/href">https://medium.com/media/12ddb6b828c36e934de9881be621c559/href</a></iframe><figure><img alt="" src="https://cdn-images-1.medium.com/max/136/1*vBu4lhZeJ9nbsBel2P6IXg.gif" /><figcaption>Final animation (circle)</figcaption></figure><h3>✅ Good job reading all the way to the end 💪</h3><h4>You can now reproduce a “similar” animation to the Bear iOS search animation</h4><p>You learned that sometimes, something that looks like a challenge can be achieved without too much struggle once you put enough thought into it.</p><p>This is the first animation challenge blog article I have written, and there are more to come! So if you have any feedback, let me know :).</p><p>You can find the full animation project here:</p><p><a href="https://github.com/Aymenworks/AnimationsChallenge/tree/master/ScrollToSearch">Aymenworks/AnimationsChallenge</a></p><p>If you have any questions, you can reach <a href="http://twitter.com/aymenworks">me</a> or <a href="http://twitter.com/johnestropia">John</a> on Twitter anytime :).</p><p><strong>Edit</strong>: You can already find the second animation challenge right here:</p><p><a href="https://medium.com/eureka-engineering/animations-challenges-2-asana-loader-animation-c3a6d040f358">Animations challenges #2 — Asana Loader animation</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=7ea5e4ea0a34" width="1" height="1" alt=""><hr><p><a href="https://medium.com/eureka-engineering/animations-challenges-1-bear-ios-search-animation-7ea5e4ea0a34">Animations challenges #1 — Bear iOS Search animation</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by 
highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Git tips and tricks #1]]></title>
            <link>https://medium.com/eureka-engineering/git-tips-and-tricks-1-48a11a6611c5?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/48a11a6611c5</guid>
            <category><![CDATA[git]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Tue, 04 Sep 2018 15:42:32 GMT</pubDate>
            <atom:updated>2018-09-04T15:42:32.447Z</atom:updated>
            <content:encoded><![CDATA[<blockquote><em>Git is like Google Docs for us IT people, on IT projects. Every action you perform on a project using Git can be saved. Those actions can be, for example, the creation, update or deletion of a feature. It means that as you progress on the project, you can go back to any change you have made so far, which is very useful if you made a mistake and want to step back to an earlier version of your project.</em></blockquote><blockquote><em>It works very well for team collaboration and allows us to create a better product.</em></blockquote><h3>Git tips and tricks #1</h3><p>In this first <strong>Git tips and tricks </strong>article, you’ll find a few tips that I think can be useful (from time to time, haha).</p><h4>Who added this file? 🤓</h4><p>When joining a project, I wanted to check whether there was a .gitignore file before creating one. And yes, there was one, and out of curiosity, I wanted to check who added it.</p><p>I was able to figure it out by running:</p><blockquote>git log --diff-filter=A -- fileName</blockquote><blockquote>commit 4b8d6c960e0850273901f1e0dc9a5de0b3ab7d29<br>Author: Aymen Rebouh &lt;aymen***@gmail.com&gt;<br>Date: Wed Oct 19 11:51:20 2016 +0100</blockquote><blockquote>Add .gitignore file</blockquote><p>If you look at the value given to the filter `--diff-filter=A`, `A` means Added; it also works with `D` for Deleted, `M` for Modified, etc…</p><p>But this command is not that easy to remember, right? 
By adding this alias:</p><blockquote><em>git config --global alias.whoadded 'log --diff-filter=A'</em></blockquote><p>You can now do:</p><blockquote>git whoadded .gitignore</blockquote><p>Better, right?</p><h4>Git amend without updating the commit message 😎</h4><p>For those who don’t know the amend trick, it’s an option when committing that allows you to add new changes directly to your previous commit.</p><blockquote>git log</blockquote><blockquote>5cac4fe (HEAD -&gt; master) Add Test</blockquote><p>We can see that there is a commit with `Add Test` as its title. <br>But what if you forgot, for example, to add comments to your code?</p><p>One thing we generally do is make the changes and create a new commit, something like:</p><blockquote>git add .<br>git commit -m "Add comments"<br>git log</blockquote><blockquote>9b20d05 (HEAD -&gt; master) Add comments for better use<br>5cac4fe Add Test</blockquote><p>But why create a new commit for that and clutter your git history, if one commit is enough? By using the `--amend` option, you can add all your current changes to the last commit and keep a clean git history:</p><blockquote>git add .<br>git commit --amend<br>git log</blockquote><blockquote>5095e00 Add Test</blockquote><p>Your comments have been added to the `Add Test` commit, and you can see that the commit hash changed as well.</p><p>We’re not finished. 
When amending, git gives you the option to change the commit message (by default), and it can be annoying that it always opens a text editor for that 😒.</p><p>So if you don’t want to change the commit message, you can add the `--no-edit` option.</p><blockquote>✅ git commit --amend --no-edit</blockquote><h4>Last but not least 😈</h4><p>Well, it’s not an amazing trick, but this is how your manager can see how many lines of code you wrote today:</p><blockquote>git diff --shortstat "@{0 day ago}"</blockquote><blockquote>20 files changed, 4844 insertions(+), 4362 deletions(-)</blockquote><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=48a11a6611c5" width="1" height="1" alt=""><hr><p><a href="https://medium.com/eureka-engineering/git-tips-and-tricks-1-48a11a6611c5">Git tips and tricks #1</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[CIDetector and Pictures analysis]]></title>
            <link>https://medium.com/eureka-engineering/its-in-my-opinion-the-same-for-online-dating-and-your-profile-pictures-cc63bf3b7752?source=rss-82ceb7481588------2</link>
            <guid isPermaLink="false">https://medium.com/p/cc63bf3b7752</guid>
            <category><![CDATA[cidetector]]></category>
            <category><![CDATA[image-analysis]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[ios-app-development]]></category>
            <dc:creator><![CDATA[Aymen Rebouh]]></dc:creator>
            <pubDate>Mon, 03 Sep 2018 03:01:43 GMT</pubDate>
            <atom:updated>2018-09-03T03:56:21.551Z</atom:updated>
            <content:encoded><![CDATA[<blockquote>When you meet someone for the first time, it usually takes only a few seconds for this person to form a first impression of you.</blockquote><h3>CIDetector and Pictures analysis</h3><p>In my opinion, it’s the same for online dating and your profile pictures. Your profile pictures can have a big impact on your online dating experience because they are what people see first.</p><p>There’s immense value in picture analysis, and there are so many interesting use cases. For example, some mobile applications, like Snapchat, analyze photos and apply filters to make the pictures communicate more emotion. You can also analyze photos and censor what may look like irrelevant and/or forbidden content, such as nudity. And a lot more 🤩.</p><p>I discovered <strong>CIDetector</strong> in the Core Image framework, and in this small article, I am going to show you what I found during my quick experimentation.</p><h3>Have you used Core Image before?</h3><h4>Core Image is a powerful API built into Cocoa Touch.</h4><p>Personally, I don’t use it every day. But it’s interesting to see that it contains such incredible and useful features.</p><h4><strong>CoreImage — CIDetector: what is it?</strong></h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*I8C6yzGSnDuPQm7mFOiUYg.png" /></figure><p>CIDetector is an image processor object. You give it an image, and the CIDetector object finds information in it for you. That information can be:</p><ul><li>Faces</li><li>Rectangles</li><li>QRCode</li><li>Text</li></ul><p>For each of those features, you can again find more specific information. 
For a face, for example, you can find out whether:</p><ul><li>There is a smile or not</li><li>The person is blinking or not</li><li>And probably other information (see Apple’s documentation)</li></ul><h4>Smile detection</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vcWsVO8q6krqH1EyrfQHIA.png" /></figure><p>So, we have this image and we want to find all the faces that appear in it. After that, we want to see whether each person is smiling or not.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*SCppKL-PCUkBWM0sBOtwLA.png" /></figure><h4>How can we do it?</h4><h4>#1 Use CIDetector for detecting faces</h4><pre>let detector = CIDetector(<br>             ofType: <strong>CIDetectorTypeFace</strong>, <br>             context: nil, <br>             options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]<br>)!</pre><h4>#2 Use CIDetector features for detecting smiles</h4><pre>let faces = detector.features(<br>                        in: CIImage(image: yourImage)!,<br>                        options: [<strong>CIDetectorSmile</strong>: true]) as? [CIFaceFeature]</pre><h4>#3 Then, do whatever you want with the results you got</h4><pre>for face in faces ?? [] {<br>// <strong>face.bounds</strong>, <strong>face.hasSmile</strong>, <strong>face.mouthPosition</strong>, etc..<br>}</pre><p>Have you ever had to analyze pictures for some reason? 
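</p><p>Putting the three steps above together, a minimal end-to-end sketch might look like this (<em>yourImage</em> is a placeholder name, and the force-unwrap of the detector is for brevity):</p>

```swift
import CoreImage
import UIKit

/// Returns the face features found in `image`, with smile detection enabled.
/// A sketch assembling the three steps above; error handling is omitted.
func detectFaces(in image: UIImage) -> [CIFaceFeature] {
    // #1 A face detector configured for high accuracy.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])!

    // #2 Ask for features, opting in to smile detection.
    guard let ciImage = CIImage(image: image) else { return [] }
    let features = detector.features(in: ciImage, options: [CIDetectorSmile: true])

    // #3 Keep only the face features.
    return features.compactMap { $0 as? CIFaceFeature }
}

// Usage:
// for face in detectFaces(in: yourImage) {
//     print(face.bounds, face.hasSmile, face.mouthPosition)
// }
```

<p>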
Let me know on Twitter: <a href="https://twitter.com/aymenworks">@aymenworks</a>.</p><p>I presented this small topic at potatotips #54, a meetup where iOS and Android engineers share their tips, as simple as that 😅.</p><p>You can find the slides <a href="https://speakerdeck.com/aymenworks/swifty-range-operator-and-cidetector-power">here</a>.</p><p><a href="https://potatotips.connpass.com/event/95391/">potatotips #54 (iOS/Android開発Tips共有会) (2018/08/23 19:00〜)</a></p><p>Thanks for reading 🚀</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=cc63bf3b7752" width="1" height="1" alt=""><hr><p><a href="https://medium.com/eureka-engineering/its-in-my-opinion-the-same-for-online-dating-and-your-profile-pictures-cc63bf3b7752">CIDetector and Pictures analysis</a> was originally published in <a href="https://medium.com/eureka-engineering">Pairs Engineering</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>