Search Results

Search found 35343 results on 1414 pages for 'development tools'.

  • Advice on how to move from a .NET developer role to a web developer role

    - by dermd
    I've been working primarily as a .NET developer for the past 4 years for a financial services company. I've worked on .NET 1.1, 2.0 and 3.5, and have done the 3.5 enterprise app developer cert (not that that's worth a whole lot!). Before that I worked as a Java developer, with a bit of Flex thrown in, for just over a year. My educational background is an electronic and computer engineering degree, a higher diploma in systems analysis, another in web development (mainly Java: JSP, Spring, etc.), and a science masters in software design and development. I really feel like a change and would like to move to a different field to experience something different. I've done some courses in RoR and played around with it a bit in my spare time. Similarly, I've done various web and mobile courses and built some mobile webapps along with Android and iOS equivalents (I haven't tried pushing them to the app stores yet, but it may be worth tidying them up and doing that). I currently work long hours, so I find it hard to make time for enough side projects to put a decent portfolio together. But when I do work on the web stuff I find it really enjoyable, so I think it's something I'd like to do full time. However, since my experience is pretty much all .NET and financial services, I find it very hard to get my foot in the door anywhere, or to get past a phone screen, unless they're specifically looking for someone with .NET knowledge. What is the best way to move into a web development role without starting from scratch again? I do think a lot of my skills translate over, but I seem to just get matched with .NET jobs whenever I look around. Apart from JS, jQuery, HTML5 and Objective-C, are there any other technologies I should be looking into?

    Read the article

  • .co.uk targeted for google.co.uk .com targeted for google.com

    - by Higgs Boson
    We've had a website running on a .co.uk domain for some years; this domain is listed in the SERPs for our brand on both google.co.uk and google.com. We get little traffic from anywhere other than the UK, because the website is targeted at the UK market with UK-specific keywords. This is great; however, we recently purchased the .com domain with the intention of producing a second version of the website targeted at the United States with US-specific keywords, i.e. targeting and moving into the US marketplace. We have used Google Webmaster Tools to set the geographic target for the .com domain to the US. I was expecting ONLY the .com site to show up when searching google.com, and only the .co.uk site to show up when searching google.co.uk. However, when we search google.com for our brand, the .co.uk site is listed in the SERPs. We would prefer the .com to appear in the SERPs on google.com. Is there anything we can do?
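    A minimal sketch of one widely used remedy, assuming both sites serve equivalent pages at mirrored paths (the example.co.uk/example.com domains below are placeholders, not the asker's real domains): hreflang annotations tell Google which regional version to show in each country's results.

        <!-- Hypothetical sketch: placed in the <head> of each page on BOTH domains -->
        <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/some-page" />
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/some-page" />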

    Read the article

  • MacGyver Moments

    - by KKline
    In case you haven't heard, your MacGyver Moments are those times when you improvised an excellent solution to a problem using non-traditional materials, techniques, or tools......(read more)

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
    In my previous post, “Automating Deployments with SQL Compare command line”, I looked at how teams can automate the deployment and post-deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I'm looking at another use for the command line tools: using them to generate up-to-date documentation with every database change.

    There are many reasons why up-to-date documentation is valuable, for example when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risk of making incorrect decisions when making changes, and documentation is very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples on this site discussing why up-to-date documentation is valuable: Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need that most!

    SQL Doc is a fast, simple tool that automatically generates database documentation. It can create documents as HTML, Word or PDF files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate's SQL Developer Bundle and SQL Toolbelt, allows you to add notes to objects and customise which objects are shown in the docs. These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring that the documentation is always up to date.

    The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However, if you have a source-controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If you're using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place, you can use the functionality of a CI server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up-to-date data dictionary. If you don't already have a CI server in place there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won't cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may also be interested in Red Gate's SQL CI utility (part of the SQL Automation Pack), which is an easy way to update a database with the latest changes from source control.

    The PowerShell example below shows how to create the documentation from a database. That database might be your integration database, or a shared development database that is always up to date with the latest changes.
        $serverName = "server\instance"
        $databaseName = "databaseName"    # to document multiple databases, use a comma-separated list
        $userName = "username"
        $password = "password"

        # Path to SQLDoc.exe
        $SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe"

        $arguments = @(
            "/server:$($serverName)",
            "/database:$($databaseName)",
            "/username:$($userName)",
            "/password:$($password)",
            "/filetype:html",
            "/outputfolder:.",
            # "/project:$args[0]",        # if you already have a .sqldoc project file you can pass it as an argument to this script; values in the project will be overridden by any options set on the command line
            "/name:$databaseName Report",
            "/copyrightauthor:$([Environment]::UserName)"
        )

        write-host $arguments
        & $SQLDocPath $arguments

    There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name, and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line, please see the SQL Doc command line documentation.

    If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project, please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment this line and then pass a path to a .sqldoc project file as an argument to this script.

    Conclusion

    Keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process you can trigger the documentation to be regenerated on every change to a source-controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have any questions.
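    The article suggests a daily scheduled task as the simplest trigger. As a footnote, a minimal sketch of what that might look like, assuming the script above is saved as C:\Scripts\UpdateDocs.ps1 (a hypothetical path and task name):

        # Hypothetical sketch: register a Windows scheduled task that regenerates the docs nightly at 2 AM
        schtasks /Create /SC DAILY /ST 02:00 /TN "UpdateDatabaseDocs" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\UpdateDocs.ps1"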

    Read the article

  • 301 redirect, keyword being in bold

    - by seo-man
    Regarding the 301: if I redirect the non-www to the www domain with a 301 redirect, do I still have to specify inside Google Webmaster Tools which version (www or non-www) is preferred? Or is setting up the redirect enough, so that I don't need to set that inside GWT? Regarding keywords being in bold: usually keywords are supposed to be in bold font, and it is irrelevant whether they are links or not. But in headings (h1, h2), does the keyword also need to be in bold, or is it enough if I take care to put it at the beginning of the heading? Asked another way: does a keyword in a heading also need to be in bold font?
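    For reference, a minimal sketch of the redirect itself, assuming an Apache server (example.com is a placeholder for the real domain):

        # Hypothetical sketch for an Apache .htaccess: 301-redirect all non-www requests to www
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]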

    Read the article

  • Google crawl speed change doesn't seem to work. Why?

    - by Olivier Pons
    Three days ago I changed the Google crawl rate for my website, setting it to two requests per second. I got a message in Google Webmaster Tools that the speed change has been taken into account, but after more than three days nothing has happened: it is still one request every ten seconds. My web server is very fast and can handle up to twenty simultaneous connections, and my website is brand new, which means Google is almost the only one crawling it. After more than 30,000 successful requests (= no 404s), I think there's something going on... or maybe this is just a bug? Has anyone ever had this problem?
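    One way to double-check the crawl rate independently of the Webmaster Tools graphs is to count Googlebot hits per hour in the server access log. A minimal sketch, assuming an Apache common/combined log format and a hypothetical log path:

        # Hypothetical sketch: tally Googlebot requests per hour from an access log
        Get-Content "C:\logs\access.log" |
            Where-Object { $_ -match "Googlebot" } |
            ForEach-Object { ($_ -split " ")[3] } |     # e.g. [10/Oct/2013:13:55:36 (common log format assumed)
            Group-Object { $_.Substring(1, 14) } |      # group by "dd/MMM/yyyy:HH"
            Sort-Object Name |
            Format-Table Name, Count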

    Read the article

  • 3 language website using subdomains and mapped domains. Add subdomains or mapped domains to WMT?

    - by Owen Mclaughlin
    I have a new WordPress multisite setup. The main language is Italian, with two subdomains using en and de for English and German. No auto-translation plugins are being used. The WordPress theme, by Studiopress.com, has SEO built in. I am a little confused as to which domains to use in Webmaster Tools. If I use the subdomains (en and de), which have the SEO set up, then Google will index and show en.example.it, but it won't know about the mapped domains or display them. If I use the mapped domains, then won't Google miss the SEO for the subdomains? I am muddled by this. What should I do?

    Read the article

  • Weird 301 redirection by google crawler

    - by Ace
    I have some pages on my website www.acethem.com which Google reports as 301 redirects, but they are not actually 301 redirects. E.g. www.acethem.com/pastpapers/by-year/2007/ is seen by Google as a 301 redirect to www.acethem.com/pastpapers/by-year (I am using "Fetch as Google" in Webmaster Tools). Stranger still: my paginated pages with page = 10 are all redirected to the homepage: http://www.acethem.com/pastpapers/o-level/chemistry/page/10/, while http://www.acethem.com/pastpapers/o-level/chemistry/page/9/ is handled properly by the Google crawler. Note that all these pages work fine, with no redirect, in browsers. Side note: on www.acethem.com/pastpapers/by-year/2007/, the Facebook share button also points to www.acethem.com/pastpapers/by-year/.
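    Since the pages behave differently for browsers and for the crawler, one quick check is to request them without following redirects, once with a normal user agent and once with a Googlebot-style one, and compare the raw status codes. A minimal sketch in PowerShell:

        # Hypothetical sketch: fetch the raw status code and Location header without following redirects
        $req = [System.Net.WebRequest]::Create("http://www.acethem.com/pastpapers/by-year/2007/")
        $req.AllowAutoRedirect = $false
        $req.UserAgent = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
        $resp = $req.GetResponse()
        [int]$resp.StatusCode          # 200 vs 301 shows whether the server treats the crawler differently
        $resp.Headers["Location"]
        $resp.Close()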

    Read the article

  • Visual NHibernate Update

    - by Ricardo Peres
    I have previously talked about Visual NHibernate. It has grown since last time, now offering support for multiple databases (SQL Server, Oracle, MySQL, PostgreSQL, Firebird); it generates projects from existing databases or from existing Visual Studio projects, and produces XML or Fluent mappings, to name just a few features. To me it is by far the most interesting tool for working with NHibernate that I know of (granted, I haven't tried NHibernate Profiler). For a limited period, until the final version is released, Slyce Software is offering a 30% discount, so you may want to have a look. Please note that I am in no way related to Slyce, but I made some feature requests which have been implemented (thanks, Gareth!).

    Read the article

  • Robots.txt never downloaded but some blocked URLs in GWT

    - by Zistoloen
    There is something I don't understand in Google Webmaster Tools (GWT) for my WordPress site. The "Blocked URLs" page says that my robots.txt has never been downloaded, yet it also reports some blocked URLs. That seems contradictory. Am I missing something? This is the file:

        User-agent : *
        Disallow: /*?
        Disallow: /wp-login.php
        Disallow: /wp-admin
        Disallow: /wp-includes
        Disallow: /wp-content
        Allow: /wp-content/uploads
        Disallow: */trackback
        Disallow: /*/feed
        Disallow: /*/comments
        Disallow: /cgi-bin
        Disallow: /*.php$
        Disallow: /*.inc$
        Disallow: /*.gz$
        Disallow: /*.cgi$
        Disallow: /author/*

    I'm afraid my robots.txt doesn't block several URLs I want to block.
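    If GWT claims the file has never been downloaded, the first thing worth verifying is that the file is actually reachable and returns a 200. A minimal sketch (the domain is a placeholder):

        # Hypothetical sketch: confirm robots.txt is reachable and inspect what is actually served
        $resp = [System.Net.WebRequest]::Create("http://example.com/robots.txt").GetResponse()
        [int]$resp.StatusCode
        (New-Object System.IO.StreamReader($resp.GetResponseStream())).ReadToEnd()
        $resp.Close()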

    Read the article

  • Detect frameworks and/or CMS utilized on websites in Firefox

    - by jkneip
    I'm redesigning the website for my academic library and am examining other sites to identify the technologies they use. Things like: web frameworks, JavaScript frameworks, server-side technology, content management system. Now, I've had some real success in Firefox using plugins like Wappalyzer, Firebug, and the DOM Inspector, but some sites just don't reveal any of this info to those tools, especially, it seems, when an enterprise-level CMS is being used. Does anyone know of any other tools to detect this kind of data? Also, with Firebug and the DOM Inspector there is a lot of info displayed, and I wondered whether there is a way to infer the presence of server-side technologies, CMSs, etc. from certain elements of a web page? If this question is more relevant to another Stack Exchange site, please let me know and I'll post it there instead. Much thanks, Jason
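    One signal that doesn't depend on page markup is the HTTP response headers, which often leak the server-side stack. A minimal sketch of checking them (the URL is a placeholder):

        # Hypothetical sketch: response headers frequently identify the stack
        $resp = [System.Net.WebRequest]::Create("http://example.edu/").GetResponse()
        $resp.Headers["Server"]         # e.g. "Apache/2.2" or "Microsoft-IIS/7.5"
        $resp.Headers["X-Powered-By"]   # e.g. "PHP/5.3" or "ASP.NET"
        $resp.Close()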

    Read the article

  • Leadership does not see value in standard process for machine configuration and new developer orientation

    - by opensourcechris
    About 3 months ago our lead web developer and designer (same person) left the company; greener pastures was the reason for leaving. Good for them, I say. My problem is that their department was completely undocumented. Things have been tough since the lead left: there is a lot of knowledge, both the theoretical knowledge we use to quote new projects and the technical/implementation knowledge of our existing products, that we have lost as a result of the departure. My normal role is as a product manager (for our products themselves) and as a business analyst for some of our project-based consulting work. I've taught myself to code over the past year, and in an effort to keep moving forward I've taken on the task of setting up my laptop as a development machine, with hopes of implementing some of the easier feature requests and fixing some of the no-brainer bugs that get submitted to our ticketing system. But no one knows how to take a fresh Windows machine and configure it to work seamlessly with our production apps. I have asked my boss, who is still in contact with the developer who left, to have them document and create a process to onboard a new developer: software installation, required packages, the process to deploy to the production application servers, etc. None of this exists, and I'm spinning my wheels trying to get my computer working as a functional development machine. But she does not seem to understand the need for such a process to exist. Apparently the new developer who replaced the one who left has been using a machine that was pre-configured for our environment, so even the new developer could not set up a new machine if we added another developer. My question is two-part: Am I wrong in assuming that a process to onboard and configure a new computer to be part of our development ecosystem should exist? And am I being a whiny baby, and should I just figure the process out and create the document on my own?
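    For what it's worth, the end state of such a document is often an executable setup script rather than prose. A minimal sketch of the idea, assuming the Chocolatey package manager is available; every package name and URL below is a placeholder:

        # Hypothetical sketch: a dev-machine bootstrap script as "executable documentation"
        $packages = @("git", "notepadplusplus", "sql-server-express")   # placeholders for the real stack
        foreach ($p in $packages) {
            choco install $p -y          # assumes Chocolatey is already installed
        }
        # Fetch the application source (placeholder URL)
        git clone https://source.example.com/our-product.git C:\dev\our-product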

    Read the article

  • TDD - Outside In vs Inside Out

    - by Songo
    What is the difference between building an application Outside-In vs building it Inside-Out using TDD? These are the books I have read about TDD and unit testing: Test Driven Development: By Example; Test-Driven Development: A Practical Guide; Real-World Solutions for Developing High-Quality PHP Frameworks and Applications; Test-Driven Development in Microsoft .NET; xUnit Test Patterns: Refactoring Test Code; The Art of Unit Testing: With Examples in .NET; and Growing Object-Oriented Software, Guided by Tests (this one was really hard to understand, since Java isn't my primary language :). Almost all of them explain TDD basics and unit testing in general, but with little mention of the different ways the application can be constructed. Another thing I noticed is that most of these books (if not all) ignore the design phase when writing the application; they focus more on writing the test cases quickly and letting the design emerge by itself. However, I came across a paragraph in xUnit Test Patterns that discussed the ways people approach TDD: there are two schools out there, Outside-In vs Inside-Out. Sadly the book doesn't elaborate on this point. I wish to know the main difference between these two approaches. When should I use each one of them? For a TDD beginner, which one is easier to grasp? What are the drawbacks of each method? And are there any materials out there that discuss this topic specifically?

    Read the article

  • What a web developer can learn [closed]

    - by knoxxs
    There are many things to learn in web development. You can easily find what the most important things to learn are if you want to be a webmaster, but answers to questions about how to become a web developer or a webmaster contain only a limited list of items that someone needs to master. The problem is that these resources are not complete. When I started learning web development I followed the same steps, but after learning the basics I found I had learnt almost nothing: there were many more things to learn. I realized this by following blogs and Q&A sites. When I first downloaded HTML5 Boilerplate, some of the issues it covers I hadn't even heard about. I want you to suggest what the possible things and issues are that someone can learn, and why to learn them. I know one answer is "follow blogs and do your work, you will learn with time", but with these platforms I could get some benefit out of others' experiences. This question is not about how to become a webmaster, though an answer to it may cover that too.

    Read the article

  • Recovery from URL structure change?

    - by Dejan Pelzel
    In July this year, we changed the URL structure of the website from:

        Post:     domain.com/blog/post/986/dance/heart-beats-dance-video-by-chinatsu/
        Category: domain.com/blog/index/cosplay/

    to:

        Post:     domain.com/dance/heart-beats-dance-video-by-chinatsu-986/
        Category: domain.com/cosplay/

    Everything was (supposedly) properly redirected with 301 redirects, and at first it seemed that the traffic returned after a couple of days, but it has now been close to 2 months and things keep getting worse, although Google is slowly indexing the changes. What is worrying me even more is that the pages crawled per day reported in Webmaster Tools started dropping drastically a few days ago and has just reached a new low in months (from over 2000 to 700). Should I be worried, or will things sort themselves out eventually?
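    As a sanity check on the redirects themselves, the mappings above correspond to pattern rewrites along these lines, assuming an Apache server (a hypothetical reconstruction, not the site's actual rules):

        # Hypothetical sketch: /blog/post/986/dance/slug/ -> /dance/slug-986/ as a single 301
        RewriteEngine On
        RewriteRule ^blog/post/(\d+)/([^/]+)/([^/]+)/?$ /$2/$3-$1/ [R=301,L]
        RewriteRule ^blog/index/([^/]+)/?$ /$1/ [R=301,L]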

    Read the article

  • How to get the IP address of the Google search/crawler bot to add to our whitelist of IP addresses

    - by Jayapal Chandran
    Hi, Google Webmaster Tools says "network unreachable". When I contacted my hosting provider, they said that they have installed a firewall which can block frequent incoming IP addresses, and they don't know Google's IP addresses to unblock. So they requested that I find the Google search/crawler bot's IP addresses so they can add them to their whitelist. How do I find the IP addresses of the Google search bot or crawler bot? My site has stopped appearing in Google search and my hits have gone way down. What should I do? Any kind of reply would be helpful.
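    Worth knowing: Google does not publish a fixed list of crawler IP addresses; the documented way to verify Googlebot is a reverse DNS lookup on the visiting IP, forward-confirmed. A minimal sketch (the IP below is a placeholder you would take from your own access log):

        # Hypothetical sketch: verify that an IP seen in the logs really is Googlebot
        $ip = "66.249.66.1"                                  # placeholder: an address from your access log
        $name = [System.Net.Dns]::GetHostEntry($ip).HostName
        $name                                                # genuine Googlebot resolves to *.googlebot.com
        [System.Net.Dns]::GetHostEntry($name).AddressList    # forward-confirm it maps back to the same IP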

    Read the article

  • Why is "www.mysite.com" different from "mysite.com"?

    - by sapeish
    In any browser, if I use www.mysite.com or just mysite.com the web page is correctly retrieved, but I am having trouble with Google Analytics and a Facebook app. Facebook: to be able to get Likes, I created the Facebook app needed and set the site URL to http://mysite.com/. Using their debug tool (http://developers.facebook.com/tools/debug/), when I test my page using http://mysite.com it works, but using http://www.mysite.com it fails with the message: Object at URL 'http://www.mysite.com/' of type 'website' is invalid because the domain 'www.mysite.com' is not allowed for the specified application id. Google Analytics: to be able to get traffic statistics, I created a property and a profile, both with the URL http://www.mysite.com, and no statistics were gathered in a week; when I changed the configured URL to http://mysite.com, statistics were available a few hours later. What should I do to have statistics and Likes for both www.mysite.com and mysite.com?
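    The usual remedy is to pick one host as canonical, 301-redirect the other to it, and declare that choice in the markup so Facebook and Google agree on a single URL form. A minimal sketch of the markup half, assuming the bare domain is chosen as canonical:

        <!-- Hypothetical sketch: placed in the <head> of each page -->
        <link rel="canonical" href="http://mysite.com/" />
        <meta property="og:url" content="http://mysite.com/" />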

    Read the article

  • What's the best project management software for an internal dev, 5-man shop?

    - by P.Brian.Mackey
    I work for a large corporation, but we do small intranet web application development. Our project management tracking sucks: it's custom software built by a junior intern. For what it's worth, our development style is akin to agile, but there's nothing set in stone; it's a very customer-oriented approach. I need project tracking that meets these criteria: intranet, internal products; mostly maintenance, some new development; 5 developers; 12 products; 1 hands-off manager (he really just wants to know estimated man-hours and due dates for dev, QA and release, along with a short description of each project); free or super cheap. Bonus: a simple, pretty UI. Think pretty charts. Hope I covered everything; please ask for any clarification. If you read Dreaming in Code, the company in it uses some project tracking software that sounds pretty sweet. Note: we do have Team Foundation Server. I already tried pushing its use as PM tracking, but it's too complicated and I can't get people to sit and train. So this software has to be easy.

    Read the article

  • Why does Google report a soft 404 when I redirect to the signup page?

    - by Hettomei
    In the last month, I've had an increased number of "soft 404" errors reported by Google Webmaster Tools for pages which actually work well for users. Configuration (maybe useless): I have a website built with Rails 3.1; authentication is handled by the gem Devise. Problem: on the page http://en.bemyboat.com/yacht-charter/9965-sailboat-beneteau-oceanis-43, click "Ask a Boat request" (a simple form, submitted via GET to http://en.bemyboat.com/boat_requests/new/9965). You are redirected with HTTP status 302 to sign in, and then sent back to the "new" page after successfully signing in. Google tells me that the link on "Ask a Boat request" returns a soft 404. I can't make this form use POST (which would solve the problem), because we need to automatically redirect users back to the page after sign-in (the Devise gem memorizes the GET link). To simplify, the question is: how do I protect a private page behind authentication, reached with a simple GET, without being penalized by Google as a "soft 404"?
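    One common mitigation, since the form page is useless to crawlers anyway, is to keep Googlebot away from it entirely: mark the link nofollow and disallow the path in robots.txt. A minimal sketch (paths follow the URLs quoted above):

        <!-- Hypothetical sketch: stop the crawler from requesting the auth-gated form at all -->
        <a href="/boat_requests/new/9965" rel="nofollow">Ask a Boat request</a>

        # robots.txt addition
        User-agent: *
        Disallow: /boat_requests/new/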

    Read the article

  • VS 2010: New Add Reference dialog, tab layout and options

    - by Fabrice Marguerie
    Microsoft has just published a new free extension for Visual Studio 2010 that provides an improved Add Reference dialog, an improved tab bar, and much more. The new Add Reference dialog comes with a long-awaited feature: it's now searchable! The tab bar allows you to display the close button at the end of the bar instead of on each tab. It can also sort tabs by project and alphabetically, and tab color can vary by project or according to regular expressions. I'll let you discover the other features by yourself (HTML Copy, Triple Click, Current Line Highlighting, etc.). The name of the extension is Visual Studio Pro Power Tools. I believe its main features will come out of the box with the next version of Visual Studio.

    Read the article

  • Create speed baseline for local web file

    - by Michael Jasper
    Is there any tool or method that will load a localhost page a number of times and return averaged data for load times, onload events, DOM-ready events, etc.? I'd like to work on page speed optimization, but I need a baseline before I begin. I have used both Google Analytics and Webmaster Tools, but I'd like an automated solution that runs locally. My ideal solution would be a program or script that would take the path/file and a number of iterations, then take several minutes to load the page n times without cache and crunch the numbers to create a baseline.
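    A crude starting point is to time the raw fetches in a loop; this captures transfer time only, not onload or DOM-ready events (those need a real browser engine). A minimal sketch, assuming a placeholder local URL:

        # Hypothetical sketch: average raw load time over 20 uncached fetches
        $url = "http://localhost/mypage.html"        # placeholder path
        $times = 1..20 | ForEach-Object {
            (Measure-Command {
                Invoke-WebRequest -Uri $url -Headers @{ "Cache-Control" = "no-cache" } | Out-Null
            }).TotalMilliseconds
        }
        "Average: {0:N1} ms" -f ($times | Measure-Object -Average).Average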

    Read the article

  • Google not recognizing microdata? [duplicate]

    - by user1795832
    This question already has an answer here: How long for data highlighter mark up to appear in structured data tool? (2 answers) I added microdata using schema.org to one page of a site I help manage. Using the Google Webmaster Tools test, the page checks out and the microdata is displayed properly. But when I go to the Structured Data page in Webmaster Tools, it keeps saying the site does not have any. I put it in 2 weeks ago. Is it just something that takes a while to be recognized? Or does microdata have to be on every page for it to be recognized, or something?

    Read the article

  • How to evaluate SEO/prominence improvement [on hold]

    - by Rober
    I will be working on a website's SEO, and before starting I would like to take a snapshot of its present status so that I can compare it with the new situation in a few months and evaluate my work and the real improvement. I don't mean whether the website is well implemented or not, but how well it is seen by Google and others: what prominence it has. I am taking some variables from Google Analytics (average daily visits...), from Google Webmaster Tools (search traffic and average position...) and some other indicators, like automatic SEO audit figures (estimated website worth, real PageRank...). What would you look at before starting an SEO improvement effort?

    Read the article
