Thursday, 19 February 2015

How to Install Scrapy on Linux Without Any Errors

While installing Scrapy (the Python web-crawling framework) you may get errors like the ones below:

1. src/lxml/lxml.etree.c:16:20: fatal error: Python.h: No such file or directory
    #include "Python.h"

2. Setup script exited with error: command 'gcc' failed with exit status 1

3. fatal error: openssl/aes.h: No such file or directory
    #include <openssl/aes.h>


4. c/_cffi_backend.c:13:17: fatal error: ffi.h: No such file or directory
    #include <ffi.h>

The errors above are caused by missing build dependencies when installing Scrapy.

To avoid these errors, install the following packages (assuming you already have Python 2 and Python 3). A consolidated command example for common distributions follows the list.

Note: depending on the distribution, the package name sometimes ends in -devel and sometimes in -dev.
For example, if python3-devel is not found, try python3-dev.
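
If you are not sure which package provides a missing header (for example Python.h or ffi.h), you can usually ask the package manager. The commands below are one way to do this and may differ slightly on your system; apt-file is not installed by default and needs apt-file update run once after installing it.

yum provides '*/ffi.h'        (RPM-based distributions)
apt-file search ffi.h         (Debian/Ubuntu)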

install gcc
install python-devel
install python3-devel
install libevent-devel
install python3-setuptools (Optional)
install python-setuptools
install openssl
install libxslt-devel (if libxslt-devel is not found, try libxslt1-devel)
install libxml2-devel
install libffi-devel
install libssl-dev (if libssl-dev is not found, try openssl-devel)
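
For example, on Debian/Ubuntu-based systems everything above can usually be installed with a single apt-get command, and on RPM-based systems (CentOS/Fedora) with yum. The package names below are the common ones, but they may vary by distribution and version:

sudo apt-get install gcc python-dev python3-dev python-setuptools python3-setuptools libevent-dev libxml2-dev libxslt1-dev libffi-dev libssl-dev openssl

sudo yum install gcc python-devel python3-devel python-setuptools libevent-devel libxml2-devel libxslt-devel libffi-devel openssl-devel openssl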

Finally, install Scrapy:
easy_install scrapy
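
To confirm that Scrapy installed correctly, you can check that it imports and reports a version; either of the commands below should work with a recent Scrapy release:

scrapy version
python -c "import scrapy; print(scrapy.__version__)"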



