Search Results

Search found 7807 results on 313 pages for 'dreamweaver sites'.


  • Web Developer Portfolio - Role Definition

    - by GSTAR
    I'm putting together a portfolio that lists all the websites I have developed or worked on during the past year. This has become quite a long list, simply because 60% of the websites I have listed are ones where I developed certain sections of the site, or re-developed a section of an existing site, but did not build the whole site. So basically you could say I made a 20-50% contribution on those particular sites. I don't want to give a potential employer the false impression that I fully developed every site listed in my portfolio, so I am after a neat way to indicate this. On the websites that I fully developed, I have put a small label next to the name which reads "Lead Developer". What would be the equivalent label for the sites I have partially developed, or projects where I was one of several developers? I suppose what I'm asking is: how would you define, in 2-3 words, a non-lead developer role within a project?

    Read the article

  • .aspx websites: Are they built using Web Forms?

    - by Lazeera
    I visit many websites which I think are built with ASP.NET Web Forms because of the extension (.aspx). When I view the source of these websites I see at least one or two elements like: <input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE"> or wvcD4NCjxwPtin2YTZh9iv2YrYqSDYp9mE2KvYp9mG2YrYqSDZh9mKINit2..... However, yesterday I visited two sites: one is the ASP.NET forums - http://forums.asp.net - and the other is POF. The extension on these sites is still (.aspx), but when I view the source I cannot find any <input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE"> nor wvcD4NCjxwPtin2YTZh9iv2YrYqSDYp9mE2KvYp9mG2YrYqSDZh9mKINit2..... Now, I would like to know how those sites use ASP.NET Web Forms while their final HTML output is still clean.
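    A quick way to check any page for ViewState from the command line is to fetch it and search for the hidden field; a minimal sketch with curl (the URL is just the example from above):

        # Count lines containing the __VIEWSTATE hidden field in the served HTML.
        # A count of 0 suggests the page renders without ViewState.
        curl -s "http://forums.asp.net/" | grep -c '__VIEWSTATE'

    As for the underlying question: the __VIEWSTATE field is only emitted inside a <form runat="server">, so a Web Forms page that avoids the server-side form (or an ASP.NET MVC page, which does not use ViewState at all) can serve .aspx URLs with perfectly clean HTML.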

    Read the article

  • Google I/O 2010 - Connect enterprise apps w/ Google Docs

    Google I/O 2010 - Connecting your enterprise applications with Google Docs and Sites (Enterprise 201). Eric Bidelman, Vijay Bangaru, Matthew Tonkin (Memeo Inc). Learn how your organization can harness the power of Google Docs and Sites directly from within your existing enterprise systems using our extensive APIs. Integrate with data from behind the firewall using Secure Data Connector. Upload, share, collaborate, and sync any file to Docs. Even automate the creation of project and team workspaces in Sites in a single click from within your CRM. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Views: 18. 0 ratings. Time: 53:19. More in Science & Technology.

    Read the article

  • Connman not connecting to internet

    - by sagarchalise
    I wanted to try connman as I heard it will be replacing network manager. I quite like its look and feel, but the problem is I do not get connected to the internet. I am using ADSL and it connects to my ADSL router but does not ping google or any other sites. I tried changing the default gateway as well as using static IPs, but to no avail. It shows that I am wired but does not open any sites or ping any external sites, except for 198.168.1.1, which is my router IP. Can anyone help? Basically everything works fine with network-manager as well as wicd.
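    A common first diagnostic for "connected to the router but no sites load" is to separate routing from DNS; a minimal sketch (8.8.8.8 is Google's public resolver):

        # Can we reach an external IP at all? This bypasses DNS completely.
        ping -c 3 8.8.8.8
        # Can we resolve names? If the ping above works but this fails, the
        # DNS servers connman configured are the problem, not the routing.
        nslookup google.com
        # Inspect the default route and resolver connman actually set up.
        ip route show
        cat /etc/resolv.conf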

    Read the article

  • PHP and performance

    - by Naif
    I always hear that PHP is for medium and small websites, whereas .NET and Java are for enterprise applications. My question is about PHP. Why is PHP not a good option for enterprise web applications? Is it because, if the web application becomes bigger, PHP will be slower since it is an interpreted language? I know the corporate world will choose .NET or J2EE because of the integration with their products, back-end services, etc. However, if we just have PHP for building sites and web applications, how can we make it perform well on big sites? In short, is there a relationship between the performance of PHP and the size of the website? What are the factors that make PHP an inappropriate option for big sites?

    Read the article

  • Displaying thumbnails in google search results for flash games?

    - by serg
    Some sites somehow display video preview thumbnails in Google search results for pages that contain flash games. For example, search for: site:www.thorgaming.com game. I see this only on small sites, which makes me believe Google is not very happy about it. How do they do it, and is it OK with Google? I assume they are submitting thumbnails through video sitemaps, but I can't find any information about using them with flash games. I also ran a few such sites through the rich snippet testing tool and it didn't detect any microdata tags on the pages.
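    For reference, a minimal video sitemap entry looks roughly like the sketch below (all URLs are placeholders); whether Google will accept a flash game's SWF as the "video" content is exactly the open question here:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
          <url>
            <loc>http://www.example.com/games/some-game.html</loc>
            <video:video>
              <!-- The thumbnail that may be shown next to the search result -->
              <video:thumbnail_loc>http://www.example.com/thumbs/some-game.jpg</video:thumbnail_loc>
              <video:title>Some Game</video:title>
              <video:description>Playable flash game.</video:description>
              <!-- For a flash game this would point at the SWF; unverified -->
              <video:content_loc>http://www.example.com/swf/some-game.swf</video:content_loc>
            </video:video>
          </url>
        </urlset>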

    Read the article

  • Should I use my domain registrar's nameserver or find an alternative?

    - by Fazal
    I've recently been moving my clients' sites, as well as my own and my friends', to a cloud server instance I've set up. I don't have a nameserver running on my instance because I'm not sure how to deploy and manage that side of things yet. I'm using the registrars' default nameservers where possible and just changing the A records to point at the server. Some clients are complaining that the sites are running slower than before (since I changed nameservers back to the defaults). Some of the domain registrars are a nightmare to deal with, and I can't convince some of my clients to leave them. Is there a sort of paid DNS service I can use instead?
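    One way to check whether slow DNS is actually the culprit is to time lookups against the registrar's nameservers and a fast public resolver; a minimal sketch with dig (ns1.slow-registrar.example is a placeholder for the real nameserver, example.com for a real zone):

        # The ";; Query time:" line in the statistics shows how long each lookup took.
        dig example.com @ns1.slow-registrar.example +noall +stats
        # Compare against a public resolver for reference (8.8.8.8 is Google's).
        dig example.com @8.8.8.8 +noall +stats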

    Read the article

  • Google I/O 2012 - Fast UIs for the Cross-Device Web

    Boris Smus. One of the great features of the modern web is that sites work on any device with a browser. This session will focus on creating UIs for the cross-device web. We will cover building web sites that support multiple device form factors (responsive and non-responsive approaches), discuss single-page sites and some of the layout features in modern mobile browsers, and do a deep dive into multi-touch input on the web. Finally, we'll show some of the awesome new mobile debugging tools in Chrome and Chrome for Android. For all I/O 2012 sessions, go to developers.google.com. From: GoogleDevelopers. Views: 105. 3 ratings. Time: 49:31. More in Science & Technology.

    Read the article

  • Earning extra cash as a programmer

    - by Anon
    I work as a full-time programmer and have a pretty well-paid job for the country where I live, but I could do with a bit of extra cash at the moment (wife nagging about a new kitchen, etc.). I'd be interested in taking on small projects in my spare time. I'm not interested in writing malware or get-rich-quick schemes. I've checked out a few programmer freelancing sites, but the projects all seem to be very poorly paid, or posted by people who want malware created (or both). Are there any good freelancing sites that I may have missed? Are there any other ways to find small freelance projects?

    Read the article

  • Is my webhost infected? [on hold]

    - by Svein Erik
    I have 2 websites, and both websites, at random times, get redirected automatically to porn sites. I can't figure out if it's something in the code or if the web host is infected somehow. The two sites are: http://www.storkas.com/vm and http://www.prowebdesign.no/vm. Where should I start to find out what's happening? I scanned the sites on hxxp://sitecheck.sucuri.net. One odd thing I saw there is a reference to hxxp://js.nohealth.org/js/jquery-1.1.js. I do not have this in my HTML code.
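    A quick, low-tech first step is to list every script reference the live server is actually sending and compare it against your own source; a minimal sketch with curl and grep:

        # Print each <script ... src=...> tag the live page serves.
        # Anything listed here that is not in your own templates was
        # injected somewhere between your files and the visitor.
        curl -s "http://www.storkas.com/vm" | grep -oi '<script[^>]*src=[^>]*>'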

    Read the article

  • Issues With IIS Hosting Two Domains From Same Folder [closed]

    - by Bob Mc
    I have two different domain names that resolve to the same ASP.Net site. Both domains are hosted on the same server, which runs Windows Server 2003 and IIS6. The sites are differentiated in IIS Manager using host headers. However, both of the sites point to the same folder on the local drive for the site's page files. I am occasionally experiencing an ASP.Net error that says "The state information is invalid for this page and might be corrupted." I'm the site developer so I've addressed all the relevant code-related causes for this issue. However, I was wondering whether having two domains/sites sharing the same folder for an ASP.Net application might be causing this intermittent error. Also, is this generally a bad practice? Should I make separate, duplicate folders for each of the domains? Seems like that can become a maintenance headache.

    Read the article

  • Ubercart for 'serious' ecommerce?

    - by user793011
    I'm a big Drupal fan, and some very high-profile sites use Drupal, which also suggests that it's a good system. Can the same be said for Ubercart? I've used it for some small e-commerce sites purely because I know Drupal. It seems to lack some basic features and also relies on JavaScript. I don't know of any high-profile e-commerce sites using it. My productivity would take a big hit if I had to learn a new e-commerce platform, and also if I couldn't use my favorite CMS with it, but do I have a choice? If not, I guess I can hope that Drupal 7's Commerce module will be an improvement. Thanks

    Read the article

  • Dedicated domain names vs. folders under a single domain?

    - by Ben Keating
    I run WordPress Multisite for several sites. Each of these sites resolves under a single domain, e.g. example.com/foo/, example.com/bar/. I also have domain names for these, e.g. foo.com, bar.com, which are currently redirects: if a user hits foo.com, they are redirected (301) to example.com/foo/. My question is, should it be the other way around? Should I use the dedicated domain names directly? What are the pros/cons of putting multiple sites under a single domain versus their own dedicated domains? I guess I'm asking with SEO and findability in mind.
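    As a quick sanity check on the current setup, a headers-only request shows exactly what visitors and crawlers receive from the redirecting domain; a small sketch with curl (foo.com as in the example above):

        # -I asks for headers only; a healthy permanent redirect shows
        # "301 Moved Permanently" plus a Location: header for example.com/foo/.
        curl -sI "http://foo.com/"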

    Read the article

  • Google tweaks its algorithm to fight spam and duplicate content, and rejects criticism of the quality of its results

    Google tweaks its algorithm to fight spam and duplicate content, and rejects criticism of the quality of its results. In response to the harsh criticism that recently called the quality of its search results into question, Google is undertaking an update of its algorithm to counter spam and duplicate-content sites. These updates are expected to target sites that cheat or violate the guidelines laid down by Google in order to grab the top spots in the results. Particularly targeted are those that copy the content of other sites and/or have a low proportion of original content. Matt Cutts, in ...

    Read the article

  • Wordpress Multisite (Subfolders) - Google Analytics Tracking

    - by mmundiff
    I have a WordPress multisite subfolder instance that I would like to track via Google Analytics. Optimally this would be a plugin with which I could track each site in two places: 1. the main tracking code, which totals all traffic from the multisite instance; 2. the individual site tracking code, to see how each site specifically is doing. I think this plugin would have worked for me if I had a subdomain multisite instance: http://wordpress.org/extend/plugins/google-analytics-multisite-async/installation/. I know I can manually place the dual tracking code (http://www.markinns.com/articles/full/adding_two_google_analytics_accounts_to_one_page), but that would involve editing a theme, and I have multiple sites using the TwentyEleven template. I don't think I can edit the theme and not have it wreak havoc on the rest of the sites using TwentyEleven. So has anyone done this? Is there a technique I'm missing? Is there a plugin available to do this in multisite subfolder installations? Is there a way to manually insert GA codes into themes which are used by multiple sites? Any insight is appreciated.
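    Before editing anything, it can help to confirm where a tracking ID is currently hard-coded; a minimal sketch, assuming a standard WordPress layout ("UA-" being the classic Analytics property prefix):

        # Show every file under the themes directory that mentions a GA
        # property ID, i.e. the files a manual change would have to touch.
        grep -rn "UA-" wp-content/themes/

    For what it's worth, the usual WordPress way to modify a theme shared by several sites without breaking the others is a child theme that overrides only header.php.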

    Read the article

  • Mozilla launches the beta of Persona, its centralized Web authentication system that does away with usernames and passwords

    Mozilla launches the beta of Persona, its centralized Web authentication system that does away with usernames and passwords. Mozilla has just launched the beta version of "Persona", its new authentication system for the Web. Persona is a sign-in mechanism that eliminates usernames and passwords on websites while remaining secure and easy to use. The goal of the project is to let users sign in to different sites without a site-specific password and without resorting to centralized authentication services like Facebook or OpenID, and to free websites from worrying about password security.

    Read the article

  • Microsoft reportedly more popular than Google, according to the latest comScore estimates

    Microsoft reportedly more popular than Google, according to the latest comScore estimates. If there is one subject that often divides opinion, after Microsoft vs. Apple, it is Microsoft vs. Google. And the statistics published today by comScore are likely to revive the debate. While Google is still the most-used search engine with the largest number of unique visitors, Microsoft's sites beat it in terms of popularity. These figures come from data on the English-language Web and the properties owned by each firm. All the sites and associated services of the two giants were therefore taken into account, which rather changed the picture. When Bing is added to the other sites ...

    Read the article

  • Firefox 13 takes too many screenshots for its new history page; thumbnails of content viewed over HTTPS are displayed there

    Firefox 13 takes too many screenshots for its new history page; thumbnails of content viewed over HTTPS are displayed there. A big blunder in Firefox 13. The most recent version of the browser, released at the beginning of the month, records every site the user visits, complete with a screenshot, in order to personalize its home page. The problem: "every site" also includes sites secured with HTTPS. On its new page, shown by default when a new tab is opened, Firefox 13 displays thumbnails of sites to make surfing faster and more convenient. But these thumb...

    Read the article

  • Fusion Middleware 11gR1: June 2012 updates

    - by Hiro
    June 2012 (as of 2012/06/19): updates to Fusion Middleware 11gR1. Two items of particular note this time. 1. Oracle WebCenter Sites: Oracle WebCenter Sites 11.1.1.6.0 has been released. This is the former FatWire Software product, rebranded as "Oracle WebCenter Sites" following Oracle's acquisition of FatWire, and now provided as part of Fusion Middleware 11g Release 1. Note that availability on HP-UX differs from the other platforms; see the Release Notes for details. Supported platforms: AIX, Linux x86, Linux x86-64, Solaris (SPARC), Windows (32-bit), Windows x64. 2. Oracle JRockit and JRE/JDK: the Oracle JRockit and JRE/JDK releases have been updated: Oracle JRockit R28.2.3, Oracle JRE/JDK 6 Update 32, Oracle JRE/JDK 7 Update 4.

    Read the article

  • Web development - strange happenings

    - by Jason
    As I'm teaching myself PHP and MySQL during break, I'm experimenting with coding in an Ubuntu virtual machine where Apache, MySQL and PHP have been installed and configured against a shared folder. I'm not a big fan of Kompozer because the source code layout is a PIA, so I've started checking out gPHPEdit. However, since using it, I've come across two issues: 1. When I edit the .html and .php files, sometimes the file extension will change to .html~ and .php~, becoming invisible to the browser. The only solution is to switch to Windows, right-click and rename the file extension. 2. In Ubuntu Firefox, when I click on my project's Submit button in a practice form, a dialog box pops up asking what Firefox should do with the .php file, rather than simply displaying it in the browser. When I do this in Windows Chrome & Firefox, it goes right to the response page. I'm not sure if this behavior is limited to gPHPEdit/Kompozer, but I've never noticed it happening in Dreamweaver. Any solutions? EDIT: The behavior in point 1 occurs both when Dreamweaver is open in Windows accessing the same files and when it is not. I changed the filename extension of welcome.php, added a comment in gPHPEdit, and the file changed to welcome.php~ upon saving.
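    On the first point: files ending in ~ are, in all likelihood, the editor's automatic backup copies (many Linux editors save the previous version as name~ next to the file) rather than your original being renamed, and your editor likely has a preference to turn this off. A small sketch to find and remove the backups (the path is a placeholder for the shared folder):

        # List editor backup files under the web root, then delete them;
        # the real .php/.html files are left untouched.
        find /path/to/shared-folder -name '*~' -print
        find /path/to/shared-folder -name '*~' -delete

    The second issue is server-side: a download prompt for a .php file usually means Apache served the file without passing it to PHP, so it is worth confirming the PHP module is enabled for the vhost serving the shared folder.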

    Read the article

  • Cannot Start Nginx Compiled from Source

    - by Jason Alan Kennedy
    I am trying to compile Nginx from source, based on the original compiled Nginx server running on my DigitalOcean server (Ubuntu 14.04 x64), but with a few extra modules. I can get everything installed smoothly but I cannot get it to start. I am sure the init script is correct because I copied the original from the currently running Nginx server (even though I see that Nginx now adds it when compiling from source). Below is the lengthy process I am performing - sorry, but I wanted to be thorough for those who are in need of the info. Because I am a newbie to Nginx, I am sure I am missing something or just have it all wrong. If you can look over what I have done and spot anything I need to change, I will greatly appreciate it. Thanks!

    With the original Nginx server still running, I check the current/running Nginx configuration so I can build the new Nginx instance the same way but with the added modules:

        nginx -V

    The output:

        configure arguments: --with-cc-opt='-g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2' --with-ld-opt='-Wl,-Bsymbolic-functions -Wl,-z,relro' --prefix=/usr/share/nginx --conf-path=/etc/nginx/nginx.conf --http-log-path=/var/log/nginx/access.log --error-log-path=/var/log/nginx/error.log --lock-path=/var/lock/nginx.lock --pid-path=/run/nginx.pid --http-client-body-temp-path=/var/lib/nginx/body --http-fastcgi-temp-path=/var/lib/nginx/fastcgi --http-proxy-temp-path=/var/lib/nginx/proxy --http-scgi-temp-path=/var/lib/nginx/scgi --http-uwsgi-temp-path=/var/lib/nginx/uwsgi --with-debug --with-pcre-jit --with-ipv6 --with-http_ssl_module --with-http_stub_status_module --with-http_realip_module --with-http_addition_module --with-http_dav_module --with-http_geoip_module --with-http_gzip_static_module --with-http_image_filter_module --with-http_spdy_module --with-http_sub_module --with-http_xslt_module

    NOTE: The configure arguments below return errors during 'make', so I removed them. I don't know what they are - could this be related to my issue?

        --with-cc-opt='-g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2' --with-ld-opt='-Wl,-Bsymbolic-functions -Wl,-z,relro'

    Moving on:

        # So I don't have to sudo every line:
        sudo bash
        # Check for updates first thing:
        apt-get update
        # Install various prerequisites needed to compile Nginx:
        apt-get install build-essential libgd2-xpm-dev lsb-base zlib1g-dev libpcre3 libpcre3-dev libbz2-dev libxslt1-dev libxml2 libssl-dev libgeoip-dev tar unzip openssl
        # Create the system user (if it doesn't exist - but I see it's already there on DigitalOcean's droplets):
        adduser --system --no-create-home --disabled-login --disabled-password --group www-data
        # Download Nginx:
        wget http://nginx.org/download/nginx-1.7.4.tar.gz
        tar -xvzf nginx-1.7.4.tar.gz
        # Then Google PageSpeed:
        wget https://github.com/pagespeed/ngx_pagespeed/archive/release-1.8.31.4-beta.zip
        unzip release-1.8.31.4-beta.zip
        # cd into the PageSpeed directory
        cd ngx_pagespeed-release-1.8.31.4-beta/
        # and add the PSOL files in there:
        wget https://dl.google.com/dl/page-speed/psol/1.8.31.4.tar.gz
        tar -xzvf 1.8.31.4.tar.gz
        # Get back to the root directory:
        cd
        # I add the ngx_cache_purge module and will install the Nginx Helper plugin for WP later:
        wget https://github.com/FRiCKLE/ngx_cache_purge/archive/2.1.zip
        unzip 2.1.zip
        # Add the headers-more-nginx-module:
        wget https://github.com/openresty/headers-more-nginx-module/archive/v0.25.zip
        unzip v0.25.zip
        # and the naxsi module for added security:
        wget https://github.com/nbs-system/naxsi/archive/0.53-2.tar.gz
        tar -xvzf 0.53-2.tar.gz
        # cd to the new Nginx directory
        cd nginx-1.7.4
        # Set up the configuration build based on the current running Nginx config args and add my additional modules:
        ./configure \
        --add-module=$HOME/naxsi-0.53-2/naxsi_src \
        --prefix=/usr/share/nginx \
        --conf-path=/etc/nginx/nginx.conf \
        --http-log-path=/var/log/nginx/access.log \
        --error-log-path=/var/log/nginx/error.log \
        --lock-path=/var/lock/nginx.lock \
        --pid-path=/run/nginx.pid \
        --http-client-body-temp-path=/var/lib/nginx/body \
        --http-fastcgi-temp-path=/var/lib/nginx/fastcgi \
        --http-proxy-temp-path=/var/lib/nginx/proxy \
        --http-scgi-temp-path=/var/lib/nginx/scgi \
        --http-uwsgi-temp-path=/var/lib/nginx/uwsgi \
        --user=www-data \
        --group=www-data \
        --with-debug \
        --with-pcre-jit \
        --with-ipv6 \
        --with-http_ssl_module \
        --with-http_stub_status_module \
        --with-http_realip_module \
        --with-http_addition_module \
        --with-http_dav_module \
        --with-http_geoip_module \
        --with-http_gzip_static_module \
        --with-http_image_filter_module \
        --with-http_spdy_module \
        --with-http_sub_module \
        --with-http_xslt_module \
        --with-mail \
        --with-mail_ssl_module \
        --add-module=$HOME/ngx_pagespeed-release-1.8.31.4-beta \
        --add-module=$HOME/ngx_cache_purge-2.1 \
        --add-module=$HOME/headers-more-nginx-module-0.25

    Configuration summary:

        + using system PCRE library
        + using system OpenSSL library
        + md5: using OpenSSL library
        + sha1: using OpenSSL library
        + using system zlib library
        nginx path prefix: "/usr/share/nginx"
        nginx binary file: "/usr/share/nginx/sbin/nginx"
        nginx configuration prefix: "/etc/nginx"
        nginx configuration file: "/etc/nginx/nginx.conf"
        nginx pid file: "/run/nginx.pid"
        nginx error log file: "/var/log/nginx/error.log"
        nginx http access log file: "/var/log/nginx/access.log"
        nginx http client request body temporary files: "/var/lib/nginx/body"
        nginx http proxy temporary files: "/var/lib/nginx/proxy"
        nginx http fastcgi temporary files: "/var/lib/nginx/fastcgi"
        nginx http uwsgi temporary files: "/var/lib/nginx/uwsgi"
        nginx http scgi temporary files: "/var/lib/nginx/scgi"

    Next step: I cd to root and check the old Nginx folder locations, and double-check the 'make' output to see that they are the same:

        whereis nginx

    Output:

        nginx: /usr/sbin/nginx /etc/nginx /usr/share/nginx

    NOTE: Not sure about the '/usr/sbin/nginx' - possible issue???

    Next I copy the old /etc/nginx/nginx.conf, /etc/nginx/sites-available/default, /etc/nginx/sites-enabled/default and /etc/init.d/nginx to local text files for safe keeping, to reuse in the new Nginx server. Then I stop the running Nginx server with service nginx stop and verify it's stopped with service --status-all, whose output is: [ - ] nginx

    To verify that there are two Nginx directories, I cd to: cd nginx* and the output is an error indicating there are two nginx folders - cool beans! :)

    Now install the new Nginx server:

        cd nginx-1.7.4
        make install

    Install output:

        make -f objs/Makefile install
        make[1]: Entering directory `/home/walkingfish/nginx-1.7.4'
        test -d '/usr/share/nginx' || mkdir -p '/usr/share/nginx'
        test -d '/usr/share/nginx/sbin' || mkdir -p '/usr/share/nginx/sbin'
        test ! -f '/usr/share/nginx/sbin/nginx' || mv '/usr/share/nginx/sbin/nginx' '/usr/share/nginx/sbin/nginx.old'
        cp objs/nginx '/usr/share/nginx/sbin/nginx'
        test -d '/etc/nginx' || mkdir -p '/etc/nginx'
        cp conf/koi-win '/etc/nginx'
        cp conf/koi-utf '/etc/nginx'
        cp conf/win-utf '/etc/nginx'
        test -f '/etc/nginx/mime.types' || cp conf/mime.types '/etc/nginx'
        cp conf/mime.types '/etc/nginx/mime.types.default'
        test -f '/etc/nginx/fastcgi_params' || cp conf/fastcgi_params '/etc/nginx'
        cp conf/fastcgi_params '/etc/nginx/fastcgi_params.default'
        test -f '/etc/nginx/fastcgi.conf' || cp conf/fastcgi.conf '/etc/nginx'
        cp conf/fastcgi.conf '/etc/nginx/fastcgi.conf.default'
        test -f '/etc/nginx/uwsgi_params' || cp conf/uwsgi_params '/etc/nginx'
        cp conf/uwsgi_params '/etc/nginx/uwsgi_params.default'
        test -f '/etc/nginx/scgi_params' || cp conf/scgi_params '/etc/nginx'
        cp conf/scgi_params '/etc/nginx/scgi_params.default'
        test -f '/etc/nginx/nginx.conf' || cp conf/nginx.conf '/etc/nginx/nginx.conf'
        cp conf/nginx.conf '/etc/nginx/nginx.conf.default'
        test -d '/run' || mkdir -p '/run'
        test -d '/var/log/nginx' || mkdir -p '/var/log/nginx'
        test -d '/usr/share/nginx/html' || cp -R html '/usr/share/nginx'
        test -d '/var/log/nginx' || mkdir -p '/var/log/nginx'

    I copy/create the files that I saved earlier into sites-available, plus the config, default and init files, then symlink them to sites-enabled, and so on. And now to start the server:

        service nginx start

    And this is where s#!+ hits the fan - nada. I check to see if Nginx is running with service --status-all and it's not. Also with nginx -V, and it's not installed??? I rebooted the system too and still nothing.

    So I am not sure what is wrong here. The init script was copied over from the old server along with all the other config files, after deleting the old files. When I opened the newly compiled files, the default nginx data was present, so I replaced it with my old original data prior to starting the new server for the first time. Also, to be safe, I did rm /etc/nginx/sites-enabled/default and symlinked with ln -s /etc/nginx/sites-available/default /etc/nginx/sites-enabled/default with no errors, and I verified that the data was in the sites-enabled/default file.

    I don't think the server really/fully installed, because of the nginx -V result:

        The program 'nginx' can be found in the following packages:
        * nginx-core
        * nginx-extras
        * nginx-full
        * nginx-light
        * nginx-naxsi
        Try: apt-get install <selected package>

    Do/should I apt-get install nginx-1.7.4? Or what package do I use, given that it's a custom build and make install earlier did nothing? If you need to see the conf files I copied over from the old to the custom server, LMK and I'll post them. Again, your help here would be appreciated!
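    One plausible reading of these symptoms: the old package binary lived at /usr/sbin/nginx (as whereis showed), the custom build installs to /usr/share/nginx/sbin/nginx, and nothing on the PATH points there any more, so the shell falls back to Ubuntu's "command not found" package hints, which is exactly the package list shown above. A sketch of how to verify, under that assumption:

        # What does the shell resolve 'nginx' to, and do the binaries exist?
        type -a nginx
        ls -l /usr/sbin/nginx /usr/share/nginx/sbin/nginx
        # Run the freshly built binary explicitly; if this prints the new
        # configure arguments, the build itself is fine.
        /usr/share/nginx/sbin/nginx -V
        # Then point the init script (or a symlink) at the new location:
        ln -s /usr/share/nginx/sbin/nginx /usr/sbin/nginx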

    Read the article

  • A Few of My Favorite HTML5 and CSS3 Online Tools

    - by dwahlin
    I really enjoy coding up HTML5, CSS3, and JavaScript applications, but there are some things that I'm better off writing with the help of a development tool. For example, CSS3 gradients aren't exactly the most fun thing to write by hand, and the same could be said for animations, transforms, or styles that require various vendor extensions. There are a lot of online tools that can simplify building HTML5/CSS3 sites and increase productivity in the process, so I thought I'd put together a post on a few of my favorite tools.

    HTML5 Boilerplate: HTML5 Boilerplate provides a great way to get started building HTML5 sites. It includes many best practices out of the box and even includes a few tricks that many people don't know about. The custom download option allows you to pick the features that you want to include in the files that are generated.

    Initializr: Although HTML5 Boilerplate provides a great foundation for starting HTML5 sites, it focuses on providing a starting shell structure (namely an HTML page, JavaScript files, and a CSS stylesheet) and doesn't include much in the way of page content to get started with. Initializr builds on HTML5 Boilerplate and provides an initial test page that can be tweaked to meet your needs. It also provides several different customization options to include/exclude features.

    CSS3 Maker: CSS3 provides a lot of great features, ranging from gradient support to rounded corners. Although many of the features are fairly straightforward, there are some that are pretty involved, such as gradients, animations, and really any styles that require custom vendor extensions to work across browsers. Sure, you can type everything by hand, but sites such as CSS3 Maker provide a visual way to generate CSS3 styles.

    CSS3, Please!: CSS3, Please! is a code generation tool that can be used to generate cross-browser CSS3 styles quickly and easily. All of the main things you can do with CSS3 are available, including a clever way to visually generate CSS3 transform styles.

    Ultimate CSS Gradient Generator: CSS3 Maker (above) has a gradient generator built in, but my favorite tool for creating CSS3 gradients is the Ultimate CSS Gradient Generator. If you've created gradients in tools like Photoshop then you'll love what this tool has to offer, especially since it makes it extremely straightforward to work with different gradient stops.

    @font-face Fonts: Although @font-face has been available for a while, I think fonts are cool and wanted to mention a site that provides a lot of font choices. When used correctly, fonts can really enhance a page, and when used incorrectly (think Comic Sans) they can absolutely ruin a page. Several sites exist that provide fonts that can be used with @font-face definitions in CSS stylesheets. One of my favorites is Font Squirrel.

    HTML5 & CSS3 Support and Tests: Interested in knowing what HTML5 and CSS3 features a given browser supports? Want to know how various browsers stack up with each other as far as HTML5/CSS3 support? Look no further than the HTML5 & CSS3 Support page or the HTML5 Test page.

    CSS3 Easing Animation Tool: CSS3 animations aren't widely supported across browsers right now (I'm not really using them at this point) but they do offer a lot of promise. Creating easings for animations can definitely be a challenge, but they're critical for adding that professional touch to your animations. Fortunately you can use the Ceaser CSS Easing Animation Tool to simplify the process and handle animation easing with... ease.

    There are several other online tools that I like, but these are some of the ones I find myself using the most. If you have any favorite online tools that simplify working with HTML5 or CSS3, let me know. For more information about onsite or online training, mentoring and consulting solutions for HTML5, jQuery, .NET, SharePoint or Silverlight, please visit http://www.thewahlingroup.com.
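    As a footnote to the gradient tools mentioned above, this is the kind of cross-browser CSS the generators produce; a sketch with a made-up selector and colors, showing why hand-writing it gets tedious:

        /* One visual gradient, five declarations for browser coverage */
        .banner {
          background: #1e5799;                                                /* fallback color */
          background: -moz-linear-gradient(top, #1e5799 0%, #7db9e8 100%);    /* Firefox 3.6+ */
          background: -webkit-linear-gradient(top, #1e5799 0%, #7db9e8 100%); /* Chrome, Safari */
          background: -o-linear-gradient(top, #1e5799 0%, #7db9e8 100%);      /* Opera */
          background: linear-gradient(to bottom, #1e5799 0%, #7db9e8 100%);   /* standard syntax */
        }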

    Read the article

  • Webcast Q&A: ResCare Solves Content Lifecycle Challenges with Oracle WebCenter

    - by Kellsey Ruppel
    Last week we had the fourth webcast in our WebCenter in Action webcast series, "ResCare Solves Content Lifecycle Challenges with Oracle WebCenter", where customer Joe Lichtefeld from ResCare and Wayne Boerger & Doug Thompson from Oracle Partner TEAM Informatics shared how Oracle WebCenter is allowing ResCare to solve content lifecycle challenges, reduce compliance and business risks, and increase adoption of the intranet as a primary business communication tool. In case you missed it, here's a recap of the Q&A.

    Joe Lichtefeld, ResCare

    Q: Did you run into any issues in the deployment of the platform?
    A: We experienced very few issues when implementing the content management and search functionalities. There were some challenges in determining the metadata structure. We tried to find a fine balance between having enough fields to provide the functionality needed, while trying to limit the impact on the contributing members.

    Q: What has been the biggest benefit your end users have seen?
    A: The biggest benefit to date is two-fold. Content on the intranet can be maintained by the individual contributors more quickly than in our old process of all requests being updated by IT. The other big benefit is the ability to find the most current version of a document, instead of relying on emails and phone calls to track down the "current" version.

    Q: Was there any resistance internally when implementing the solution? If so, how did you overcome that?
    A: We experienced very little resistance. Most of our community groups were eager to be able to contribute and maintain their information. We had the normal hurdles of training and follow-up training that come with implementing a new system and process. As our second phase rolled out access to all employees, we have received more positive feedback on the accessibility of information.

    Wayne Boerger & Doug Thompson, TEAM Informatics

    Q: Can you integrate multiple repositories with the Google Search Appliance?
    A: Yes, the Google Search Appliance is designed to index lots of different repositories, from both public and internal sources. There are included connectors to many repositories, such as SharePoint, databases, file systems, LDAP, and, with the TEAM GSA Connector, the Oracle Content Server. And the index for these repositories can be configured into different collections depending on the use cases that each customer has, and really, for each need within a customer environment.

    Q: How many different filters can you add when the search results are returned?
    A: Presuming this question is about filtering the search results: you can add as many filters as you like, and it can be done by collection or any number of other criteria. Most importantly, customers now have the ability to limit the returned content by a set metadata value.

    Q: With the TEAM Sites Connector, what types of content can you sync?
    A: There's really no limit; if it can be checked into the content server, then it is eligible for sync into Sites. So basically, any digital file that has relevance to a Sites implementation can be checked into the WebCenter Content central repository, and then the connector can/will manage it.

    Q: Using the Connector, are there any limitations around where in Sites that synced content can be used?
    A: There are no limitations on where it can be used. When setting up your environment to use it, you just need to think through the different destinations on the Sites side that might use the content; that way you've got the right information to create the rules needed for the connector.

    If you missed the webcast, be sure to catch the replay to see a live demonstration of WebCenter in action! ResCare Solves Content Lifecycle Challenges with Oracle WebCenter, from Oracle WebCenter.

    Read the article

  • SphinxSearch or a spider - which one to choose?

    - by r2b2
    Hello, here is my problem: we own SiteA and SiteB, and they share the same server and database, where we have full control. SiteC, SiteD and SiteE are some of the sites we own as well, but they reside on different web hosts. The goal is to create unified search functionality for all of the sites mentioned above. That is, if somebody searches for a term on SiteA, the search results will automatically come up with results from SiteB, SiteC, SiteD and SiteE too. The search results should be shown under the website they were found in. All these websites' content is stored in their own databases. If I use SphinxSearch to index the above sites, I would then require those sites that we don't have complete control over to set up a web service where I can download a database dump or CSV file for indexing. I'm not quite sure how a spider would come into play here, so I need your opinion: Sphinx or a spider? Thanks!
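    On the web-service idea: each remote site would only need a small scheduled job that exports its searchable content for the central indexer to fetch; a minimal sketch, assuming MySQL and hypothetical database, table and column names:

        # Dump the searchable fields as a tab-separated file the indexer can pull.
        # 'sitec_db', 'articles', 'title' and 'body' are placeholders.
        mysql --batch --raw -u indexer -p \
          -e "SELECT id, title, body FROM articles" sitec_db \
          > /var/www/exports/sitec-content.tsv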

    Read the article

  • Having Issues with Curb gem on Mac Snow Leopard

    - by forgotpw1
    This has consumed hours of my time. In the console I run: require 'curb' and I get the error: LoadError: dlopen(/usr/local/lib/ruby/gems/1.8/gems/taf2-curb-0.5.4.0/lib/curb_core.bundle, 9): no suitable image found. Did find: /usr/local/lib/ruby/gems/1.8/gems/taf2-curb-0.5.4.0/lib/curb_core.bundle: mach-o, but wrong architecture - /usr/local/lib/ruby/gems/1.8/gems/taf2-curb-0.5.4.0/lib/curb_core.bundle from /usr/local/lib/ruby/gems/1.8/gems/taf2-curb-0.5.4.0/lib/curb_core.bundle from /usr/local/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require' from /Users/user/Sites/CSG/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require' from /Users/user/Sites/CSG/vendor/rails/activesupport/lib/active_support/dependencies.rb:521:in `new_constants_in' from /Users/user/Sites/CSG/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require' from /usr/local/lib/ruby/gems/1.8/gems/taf2-curb-0.5.4.0/lib/curb.rb:1 from /usr/local/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:36:in `gem_original_require' from /usr/local/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:36:in `require' from /Users/user/Sites/CSG/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require' from /Users/user/Sites/CSG/vendor/rails/activesupport/lib/active_support/dependencies.rb:521:in `new_constants_in' from /Users/user/Sites/CSG/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require' from ./lib/tokbox/base_api.rb:7 I have tried uninstalling the gem and reinstalling a number of versions with ARCHFLAGS="-arch i386". No errors or warnings are given during the install. When I try to install with rake install I get this error as well. I am working on a Mac OS X 10.6 with Ruby 1.8. I notice there are libcurl.4.dylib, libcurl.3.dylib, libcurl.2.dylib and libcurl.dylib in my /usr/lib folder... I did an install of the newest 7.20 curl package. I have tried to install from the source as well and get this error: localhost:taf2-curb-ac0b465 user$ rake install (in /Users/user/Downloads/taf2-curb-ac0b465) /Users/user/Downloads/taf2-curb-ac0b465/ext/curb_core.bundle: dlopen(/Users/user/Downloads/taf2-curb-ac0b465/ext/curb_core.bundle, 9): no suitable image found. Did find: (LoadError) /Users/user/Downloads/taf2-curb-ac0b465/ext/curb_core.bundle: mach-o, but wrong architecture - /Users/user/Downloads/taf2-curb-ac0b465/ext/curb_core.bundle from /Users/user/Downloads/taf2-curb-ac0b465/lib/curb.rb:1 from /Users/user/Downloads/taf2-curb-ac0b465/tests/helper.rb:12:in `require' from /Users/user/Downloads/taf2-curb-ac0b465/tests/helper.rb:12 from ./tests/tc_curl_download.rb:1:in `require' from ./tests/tc_curl_download.rb:1 from /usr/local/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `load' from /usr/local/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 from /usr/local/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `each' from /usr/local/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 rake aborted! Command failed with status (1): [/usr/local/bin/ruby -I"lib" "/usr/local/li...] Suggestions?
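    The "mach-o, but wrong architecture" error generally means the compiled curb_core.bundle does not contain the architecture your Ruby runs as; on Snow Leopard the toolchain defaults to 64-bit, so the ARCHFLAGS="-arch i386" used above would produce exactly this mismatch if Ruby is x86_64. A sketch of how to verify and rebuild (using the taf2-curb gem name from the traces; the mainline gem is plain 'curb'):

        # Compare architectures; the bundle must include the one Ruby runs as.
        file $(which ruby)
        file /usr/local/lib/ruby/gems/1.8/gems/taf2-curb-0.5.4.0/lib/curb_core.bundle
        # Rebuild the native extension as 64-bit instead of i386:
        gem uninstall taf2-curb
        env ARCHFLAGS="-arch x86_64" gem install taf2-curb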

    Read the article
