IJOER-DEC-2016-14


Published on January 6, 2017

Author: IJOER

Source: authorstream.com


International Journal of Engineering Research Science (IJOER), ISSN: 2395-6992, Vol-2, Issue-12, December 2016, Page | 99

Fusion of Empirical Wavelet Features for Object Recognition

Murugan S¹, Dr Anjali Bhardwaj², Dr Ganeshbabu T R³
¹Research Scholar, ²Associate Professor, ¹,²Department of ECE, Maharishi University of Information Technology, Lucknow, India
³Professor, Department of ECE, Muthayammal Engineering College, Rasipuram, India

Abstract — This paper presents an approach to recognize objects efficiently based on empirical wavelet features. Object recognition is required in many computer vision applications, and it is a challenging task because of the varying size and orientation of objects in an image. The proposed approach uses the Empirical Wavelet Transform (EWT) to extract the characteristics of objects in an image. Energy and entropy features are extracted from the EWT components, and a K-nearest neighbor (KNN) classifier is then used to recognize the object in the given image. On the benchmark object database, the Columbia Object Image Library dataset (COIL-100), the fusion of energy and entropy features provides a classification accuracy of 99.81%, whereas the energy and entropy features alone provide 98.42% and 98.97%, respectively.

Keywords — Object recognition, empirical wavelet transform, energy features, entropy features, KNN classifier.

I. INTRODUCTION

Humans can easily recognize objects of varying size, shape, and orientation; for computers, however, this is a challenging task. Many automated object recognition approaches have been developed recently, and some of them are addressed in this section. A learning strategy that models membership functions of the fuzzy attributes of surfaces is employed using a genetic algorithm (GA) in [1] for object recognition. The objective function aims at enhancing recognition performance by maximizing the degree of discrimination among classes.
It is composed of three stages: retrieval and feature extraction of a number of local parts from each model object, modeling the objects by feature vectors, and similarity measurement. A group-sensitive multiple kernel learning technique is used for object recognition in [2] to accommodate the inter-class correlation and intra-class diversity; a midway representation between the individual images and the object category is obtained. An optimization model that concurrently performs kernel dictionary learning and prototype selection is discussed in [3]. The representation matrix ensures that only a few samples are actually used to reconstruct the dictionary, and a convergent algorithm is employed to solve the formulated non-convex optimization problem. Context-model-based object recognition is discussed in [4]. It gives an efficient model that captures the information for more than a hundred object categories using a tree structure; it improves the performance of the system and also yields a coherent interpretation of a scene. Data-driven unfalsified control is implemented in [5] to overcome the drawbacks of visual servoing for object recognition. It recognizes an object by matching image features: supervisory visual servoing is applied until an accord between the model and the selected features is achieved, so that model recognition and object tracking are done successfully. Multiple kernel learning (MKL) is an approach for selecting and combining kernel functions for a given recognition task. The state of MKL, including different formulations and algorithms for solving its optimization problems, is discussed in [6], with a focus on applications to object recognition. Partial object recognition based on corner-point effective mapping is discussed in [7]: the features are extracted using corner-point analysis, and a neural network is then used for recognition. A self-adaptive module for object recognition is discussed in [8].
It consists of one selector and four passes: two direct passes, one residual pass, and one maxout pass with different receptive fields and depths; the selector is designed to help choose a reasonable output. A prototype robot for picking and placing objects is designed in [9]; image processing concepts are used for recognition using Arduino and MATLAB. In this paper, an object recognition approach based on EWT and a KNN classifier is presented. The organization of the paper is as follows: the mathematical background of EWT is given in Section II, and the next section presents the proposed object recognition system. The results obtained by the proposed system using the KNN classifier are discussed in Section IV, and the conclusion is given in the final section.

II. EMPIRICAL WAVELET TRANSFORM

Unlike the Fourier and wavelet transforms, the basis filters of EWT are not predefined; EWT is a signal-dependent method [10] based on the information content of the given image or signal. The Fourier spectrum in the range 0 to π is segmented into M parts, and band-pass filters in each segment define the empirical wavelets.
Littlewood–Paley and Meyer wavelets are used as the band-pass filters. The empirical scaling function φ̂_m(W) and the empirical wavelets ψ̂_m(W) can be described as

φ̂_m(W) =
    1                        if |W| ≤ (1 − γ)F_m
    cos[(π/2) β(τ_m)]        if (1 − γ)F_m ≤ |W| ≤ (1 + γ)F_m        (1)
    0                        otherwise

ψ̂_m(W) =
    1                        if (1 + γ)F_m ≤ |W| ≤ (1 − γ)F_{m+1}
    cos[(π/2) β(τ_{m+1})]    if (1 − γ)F_{m+1} ≤ |W| ≤ (1 + γ)F_{m+1}        (2)
    sin[(π/2) β(τ_m)]        if (1 − γ)F_m ≤ |W| ≤ (1 + γ)F_m
    0                        otherwise

where F_m denotes the m-th detected boundary in the Fourier spectrum, γ > 0 is chosen so that adjacent transition regions do not overlap, and τ_m and τ_{m+1} can be expressed as

τ_m = (1 / (2γF_m)) (|W| − (1 − γ)F_m),   τ_{m+1} = (1 / (2γF_{m+1})) (|W| − (1 − γ)F_{m+1})        (3)

The transition function β(z) satisfies the following criteria:

β(z) = 0 if z ≤ 0,   β(z) = 1 if z ≥ 1,   β(z) + β(1 − z) = 1 for z ∈ [0, 1]        (4)

The EWT decomposition of 2D images [11] is described as follows. Let x denote the image. The decomposition consists of the following steps.

Step 1: Compute the 1D Fourier transform of each row r of x, X_r(Ω), and of each column c of x, X_c(Ω), and calculate the mean row and column spectrum magnitudes as

X_R(Ω) = (1/N_Rw) Σ_{r=0}^{N_Rw − 1} |X_r(Ω)|,   X_C(Ω) = (1/N_Cl) Σ_{c=0}^{N_Cl − 1} |X_c(Ω)|        (5)

where the numbers of rows and columns are denoted by N_Rw and N_Cl, respectively.

Step 2: Perform boundary detection on X_R and X_C and build the corresponding filter banks {φ_1^R, {ψ_m^R}_{m=1}^{N_R}} and {φ_1^C, {ψ_m^C}_{m=1}^{N_C}}, where N_R and N_C are the numbers of mean row and column sub-bands, respectively.

Step 3: Filter x along the rows with {φ_1^R, {ψ_m^R}_{m=1}^{N_R}}, which provides N_R + 1 output images.

Step 4: Filter the N_R + 1 output images along the columns with {φ_1^C, {ψ_m^C}_{m=1}^{N_C}}, which provides (N_R + 1)(N_C + 1) sub-band images.

In this study, EWT is used as a feature extraction technique.
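Step 1 and the boundary detection of Step 2 can be sketched in NumPy as follows. This is a minimal illustration, not the authors' code: `detect_boundaries` uses a naive lowest-point-between-peaks rule as a simplified stand-in for the detection methods described in [10].

```python
import numpy as np

def mean_spectra(x):
    """Mean row and column Fourier magnitude spectra of image x (Eq. 5)."""
    X_R = np.abs(np.fft.fft(x, axis=1)).mean(axis=0)  # average over rows
    X_C = np.abs(np.fft.fft(x, axis=0)).mean(axis=1)  # average over columns
    return X_R, X_C

def detect_boundaries(spec, n_bands):
    """Naive boundary detection on a mean spectrum: place each boundary
    at the lowest point between consecutive members of the n_bands
    largest local maxima of the half-spectrum."""
    half = spec[: len(spec) // 2]
    # indices of local maxima
    peaks = [i for i in range(1, len(half) - 1)
             if half[i] >= half[i - 1] and half[i] >= half[i + 1]]
    # keep the n_bands largest peaks, in index order
    peaks = sorted(sorted(peaks, key=lambda i: half[i])[-n_bands:])
    bounds = []
    for a, b in zip(peaks[:-1], peaks[1:]):
        seg = half[a:b + 1]
        bounds.append(a + int(np.argmin(seg)))  # valley between the peaks
    return bounds
```

The boundaries returned for X_R and X_C would then parameterize the row and column filter banks of Steps 2–4.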
EWT has been used for the diagnosis of glaucoma in medical image processing [12], and several extensions of the 1D adaptive wavelet frames to 2D signals (images) are explained in [11].

III. PROPOSED SYSTEM

The main aim of the proposed system is to recognize objects in an image efficiently. It is composed of two computational blocks: feature (information) extraction and classification. An excellent feature extraction step improves the accuracy of any recognition or classification system; this block produces a set of salient features that represents the information required for the next stage. Figure 1 shows the feature extraction stage of the proposed EWT-based object recognition system.

FIGURE 1: FEATURE (INFORMATION) EXTRACTION STAGE OF THE PROPOSED EWT-BASED OBJECT RECOGNITION SYSTEM

EWT is a signal-dependent decomposition technique widely used in image processing applications such as medical image classification [12]. However, EWT has not been studied for object recognition; hence, the proposed system uses EWT as its feature extraction technique. Energy and entropy features are extracted from the EWT components and fused to form the feature vector. The energy and entropy features are defined in the following equations:

Energy:  EWT_k = (1/(RC)) Σ_{i=1}^{R} Σ_{j=1}^{C} |x_k(i, j)|        (6)

where x_k is the k-th component of the EWT-decomposed image and R and C are the height and width of the image.

Shannon entropy:  −Σ_i C_i² log₂(C_i²)        (7)

Log-energy entropy:  Σ_i log₂(C_i²)        (8)

Sure entropy:  Σ_i min(C_i², ε²)        (9)

where C_i are the coefficients of a particular component, with the convention log 0 = 0, and ε is a positive threshold obtained using the principle of Stein's unbiased risk estimate [13]. The features belonging to the same object are extracted, grouped, and used in the classification stage. Figure 2 shows the classification stage of the proposed EWT-based object recognition system.
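The feature computations of Eqs. (6)–(8) can be sketched as below (a minimal NumPy sketch, not the authors' implementation). The small `eps` guard approximates the log 0 = 0 convention; the SURE entropy of Eq. (9) is omitted here because it requires the estimated threshold ε.

```python
import numpy as np

def energy_feature(coeffs):
    """Mean absolute coefficient value of one sub-band (Eq. 6)."""
    return np.abs(coeffs).mean()

def shannon_entropy(coeffs, eps=1e-12):
    """Shannon entropy of squared coefficients (Eq. 7)."""
    c2 = np.asarray(coeffs, dtype=float).ravel() ** 2
    return -np.sum(c2 * np.log2(c2 + eps))

def log_energy_entropy(coeffs, eps=1e-12):
    """Log-energy entropy of squared coefficients (Eq. 8)."""
    c2 = np.asarray(coeffs, dtype=float).ravel() ** 2
    return np.sum(np.log2(c2 + eps))

def feature_vector(subbands):
    """Fuse energy and entropy features of all EWT sub-bands
    into one feature vector, as in the proposed fusion approach."""
    feats = []
    for sb in subbands:
        feats += [energy_feature(sb),
                  shannon_entropy(sb),
                  log_energy_entropy(sb)]
    return np.array(feats)
```

One such vector per training image, grouped by object identity, forms the feature database used by the classifier.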
FIGURE 2: CLASSIFICATION STAGE OF THE PROPOSED EWT-BASED OBJECT RECOGNITION SYSTEM

Classification is the final stage of the proposed system. The same technique used to extract features from the training images is applied to the testing image. A KNN classifier is used for the classification; it is an instance-based classifier, so no separate training stage is needed. For testing, the feature database obtained from the feature extraction phase is given as one of the inputs to the classifier. K-nearest neighbor classification is performed by finding the K nearest neighbors in the feature space defined by the given training feature database. Each neighbor votes on the classification of the unknown object; each vote may be counted equally, or more weight may be given to the votes of the closest neighbors. The classifier computes the Euclidean distance between the features of the testing object and the database, and the identity of the object with the minimum distance is returned. The Euclidean distance is calculated as follows. Let u = (x₁, y₁) and v = (x₂, y₂) be two points. The Euclidean distance between these two points is

d(u, v) = √[(x₂ − x₁)² + (y₂ − y₁)²]        (10)

If the points have n dimensions, u = (x₁, x₂, x₃, …, x_n) and v = (y₁, y₂, y₃, …, y_n), then the generalized Euclidean distance between these points is

d(u, v) = √[(x₁ − y₁)² + (x₂ − y₂)² + … + (x_n − y_n)²]        (11)

IV. RESULTS AND DISCUSSION

To measure the performance of the proposed object recognition system, images from the COIL-100 database [14] are used. Figure 3 shows sample objects from the COIL database. It consists of 100 objects at 128 × 128 pixel resolution.
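The voting scheme above can be sketched in a few lines (a minimal illustration under equal-weight voting, not the authors' code; with k = 1 it reduces to returning the identity at minimum distance, as described in the text):

```python
import numpy as np
from collections import Counter

def knn_classify(test_feat, train_feats, train_labels, k=1):
    """Classify one feature vector by majority vote among its K
    nearest neighbors under the Euclidean distance (Eq. 11)."""
    # distances from the test vector to every training vector
    d = np.sqrt(((train_feats - test_feat) ** 2).sum(axis=1))
    nearest = np.argsort(d)[:k]          # indices of the K closest
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]    # majority label
```

For example, with training vectors [[0, 0], [0, 1], [5, 5], [6, 5]] labeled ['a', 'a', 'b', 'b'], the query [5.5, 5.0] with k = 3 is assigned label 'b'.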
FIGURE 3: COIL DATABASE

Each object has 72 images, captured using a CCD colour camera with a 25 mm lens at every 5 degrees of rotation; hence 7200 images are available for the analysis. For training and testing purposes, the database is divided into two sets based on the turntable rotation. In this study, six predefined turntable rotation steps (10°, 20°, 30°, 45°, 60°, and 90°) are used to form the training images, and the corresponding remaining images are tested by the proposed system. Classification accuracy is used to analyze the performance of the system. It is defined by

Accuracy = (Number of correctly classified objects / Total number of objects tested) × 100        (12)

The accuracy of each object is computed using the above formula, and the average classification accuracy over the 100 objects is obtained. Tables 1 to 3 show the performance of the EWT-based object recognition system using energy features, entropy features, and the fusion of both, respectively.
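The per-object accuracy of Eq. (12) and its average over objects can be sketched as follows (a minimal illustration in plain Python; the function names are ours, not from the paper):

```python
def per_object_accuracy(predictions, truths):
    """Accuracy per object identity (Eq. 12) and the average
    accuracy over all objects."""
    correct, total = {}, {}
    for pred, true in zip(predictions, truths):
        total[true] = total.get(true, 0) + 1
        correct[true] = correct.get(true, 0) + (pred == true)
    acc = {obj: 100.0 * correct[obj] / total[obj] for obj in total}
    return acc, sum(acc.values()) / len(acc)
```

With 72 test views per COIL-100 object, a single misclassified view lowers that object's accuracy to 71/72 × 100 ≈ 98.61%.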
TABLE 1: PERFORMANCE OF THE EWT-BASED OBJECT RECOGNITION SYSTEM USING ENERGY FEATURES

EWT Decomposition |           Average accuracy (%) by training rotation
      Level       |  10°     20°     30°     45°     60°     90°
        2         | 86.64   77.72   71.85   65.63   56.77   50.44
        3         | 86.78   78.50   73.00   66.19   59.79   52.25
        4         | 89.31   80.65   75.98   69.23   62.92   57.37
        5         | 94.92   88.24   82.13   73.81   67.52   60.25
        6         | 98.42   93.50   88.68   82.41   75.71   68.31

TABLE 2: PERFORMANCE OF THE EWT-BASED OBJECT RECOGNITION SYSTEM USING ENTROPY FEATURES

EWT Decomposition |           Average accuracy (%) by training rotation
      Level       |  10°     20°     30°     45°     60°     90°
        2         | 87.19   78.09   72.18   65.94   57.08   50.74
        3         | 87.33   78.87   73.33   66.50   60.09   52.54
        4         | 89.86   81.02   76.32   69.55   63.23   57.66
        5         | 95.47   88.61   82.47   74.13   67.82   60.54
        6         | 98.97   93.87   89.02   82.72   76.02   68.60

TABLE 3: PERFORMANCE OF THE EWT-BASED OBJECT RECOGNITION SYSTEM USING THE FUSION OF ENERGY AND ENTROPY FEATURES

EWT Decomposition |           Average accuracy (%) by training rotation
      Level       |  10°     20°     30°     45°     60°     90°
        2         | 88.03   78.65   72.68   66.41   57.53   51.18
        3         | 88.17   79.43   73.83   66.97   60.55   52.99
        4         | 90.69   81.57   76.82   70.02   63.68   58.10
        5         | 96.31   89.17   82.97   74.59   68.27   60.99
        6         | 99.81   94.43   89.52   83.19   76.47   69.04

It is observed from Tables 1 to 3 that the fusion approach produces 99.81% accuracy, which is higher than the accuracies of the energy and entropy features alone: the average accuracies obtained using energy and entropy features are 98.42% and 98.97%, respectively. In all cases, the maximum average accuracy is obtained at the highest EWT decomposition level, i.e., the 6th level.
Table 4 shows the accuracy for each individual object obtained by the EWT-based object recognition system using the fusion approach.

TABLE 4: INDIVIDUAL OBJECT ACCURACY OBTAINED BY THE EWT-BASED OBJECT RECOGNITION SYSTEM USING THE FUSION APPROACH

Object Accuracy | Object Accuracy | Object Accuracy | Object Accuracy
  1    100.00   |   26   100.00   |   51   100.00   |   76   100.00
  2    100.00   |   27   100.00   |   52   100.00   |   77   100.00
  3    100.00   |   28   100.00   |   53   100.00   |   78   100.00
  4    100.00   |   29   100.00   |   54   100.00   |   79   100.00
  5    100.00   |   30   100.00   |   55   100.00   |   80   100.00
  6    100.00   |   31   100.00   |   56   100.00   |   81   100.00
  7    100.00   |   32   100.00   |   57   100.00   |   82   100.00
  8    100.00   |   33   100.00   |   58   100.00   |   83   100.00
  9    100.00   |   34   100.00   |   59   100.00   |   84    97.22
 10    100.00   |   35   100.00   |   60   100.00   |   85   100.00
 11    100.00   |   36   100.00   |   61   100.00   |   86   100.00
 12    100.00   |   37   100.00   |   62   100.00   |   87   100.00
 13    100.00   |   38   100.00   |   63   100.00   |   88   100.00
 14    100.00   |   39   100.00   |   64   100.00   |   89   100.00
 15    100.00   |   40   100.00   |   65   100.00   |   90   100.00
 16    100.00   |   41   100.00   |   66   100.00   |   91    97.22
 17    100.00   |   42   100.00   |   67    94.44   |   92   100.00
 18    100.00   |   43   100.00   |   68   100.00   |   93   100.00
 19    100.00   |   44   100.00   |   69    94.44   |   94   100.00
 20    100.00   |   45   100.00   |   70   100.00   |   95   100.00
 21     97.22   |   46   100.00   |   71   100.00   |   96   100.00
 22    100.00   |   47   100.00   |   72   100.00   |   97   100.00
 23    100.00   |   48   100.00   |   73   100.00   |   98   100.00
 24    100.00   |   49   100.00   |   74   100.00   |   99   100.00
 25    100.00   |   50   100.00   |   75   100.00   |  100   100.00

Average: 99.81

It is inferred from Table 4 that, among the 100 objects in the COIL database, only 5 objects have misclassified views, and the accuracy of each object is above 94%. It is concluded that, for better classification of the different objects in the COIL database, the 6th-level energy and entropy features are selected by the proposed system.

V. CONCLUSION

In this paper, an approach for recognizing the 100 objects in the COIL-100 database is presented using EWT and KNN. As EWT gives a better approximation of images than the DWT, it produces excellent performance for object recognition. Energy and entropy features are extracted from the EWT-decomposed images.
They are fused together and given to the KNN classifier for classification. Experimental results show that the proposed fusion approach produces 99.81% accuracy. It is also clearly observed that the fusion of energy and entropy features gives higher accuracy than either feature alone.

REFERENCES
[1] Liu Y. H., Lee A. J., and Chang F., "Object Recognition using Discriminative Parts", Computer Vision and Image Understanding, pp. 854-867, 2012.
[2] Yang Jingjing et al., "Group-Sensitive Multiple Kernel Learning for Object Recognition", IEEE Transactions on Image Processing, vol. 21, no. 5, pp. 2838-2852, 2015.
[3] Liu Huaping and Fuchun Sun, "Online Kernel Dictionary Learning for Object Recognition", IEEE International Conference on Automation Science and Engineering, pp. 268-273, 2016.
[4] Choi Myung Jin, Antonio Torralba, and Alan S. Willsky, "A Tree-Based Context Model for Object Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 2, pp. 240-252, 2012.
[5] Jiang Ping et al., "Unfalsified Visual Servoing for Simultaneous Object Recognition and Pose Tracking", IEEE Transactions on Cybernetics, vol. 46, no. 12, pp. 3032-3046, 2016.
[6] Bucak Serhat S., Rong Jin, and Anil K. Jain, "Multiple Kernel Learning for Visual Object Recognition: A Review", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 36, no. 7, pp. 1354-1369, 2014.
[7] Poonam and Sharma M., "A Corner Feature Adaptive Neural Network Model for Partial Object Recognition", 4th International Conference on Reliability, Infocom Technologies and Optimization, pp. 1-6, 2015.
[8] Wang Zhenyang, Zhidong Deng, and Shiyao Wang, "SAM: A Rethinking of Prominent Convolutional Neural Network Architectures for Visual Object Recognition", IEEE International Joint Conference on Neural Networks, pp. 1008-1014, 2016.
[9] Sahu Uma et al.,
"LIBO: The Grasping Robot Using Object Recognition", International Conference on Electrical, Electronics and Optimization Techniques, pp. 1-6, 2016.
[10] Gilles J., "Empirical Wavelet Transform", IEEE Transactions on Signal Processing, vol. 61, no. 16, pp. 3999-4010, 2013.
[11] Gilles J., Tran G., and Osher S., "2D Empirical Transforms: Wavelets, Ridgelets and Curvelets Revisited", SIAM Journal on Imaging Sciences, vol. 7, no. 1, pp. 157-186, 2014.
[12] Maheshwari Shishir, Ram Bilas Pachori, and Rajendra Acharya U., "Automated Diagnosis of Glaucoma using Empirical Wavelet Transform and Correntropy Features Extracted from Fundus Images", IEEE Journal of Biomedical and Health Informatics, 2016.
[13] Donoho D. L. and Johnstone I. M., "Adapting to Unknown Smoothness via Wavelet Shrinkage", Journal of the American Statistical Association, vol. 90, no. 432, pp. 1200-1224, 1995.
[14] COIL-100 Database: http://www.cs.columbia.edu/CAVE/software/softlib/coil-100.php
