Search Results

Search found 4271 results on 171 pages for 'mark bao'.

Page 23/171 | < Previous Page | 19 20 21 22 23 24 25 26 27 28 29 30  | Next Page >

  • Move site to new domain divided by language across subdomains

    - by mark
    I managed to find a nice domain for a fairly fledgling site of mine that actually hasn't been parked by scumbag squatters. Given the upcoming move, I'm thinking I'd take the opportunity to split the content across subdomains according to language, much like Wikipedia, for example:
    Current:
    www.old-domain.com/en/subject # English
    www.old-domain.com/subjecto # Spanish (the default, so no locale in the URL)
    Proposed:
    en.new-domain.com/subject
    es.new-domain.com/subjecto
    The advantage of doing this is a fairly competitive keyword, such that I may wish to put a copy of my application on a Spanish slice in order to gain a few SERPs. Also pure vanity. Google Webmaster Tools allows me to move to the new domain, and I can add the root domain and the subdomains, but forward to only one. I'll 301 from the old domain appropriately, but is there anything I should know about Webmaster Tools in this respect, where effectively I'm moving to two addresses? (Feel free to dissuade me from doing this in comments if it's a bad idea.)
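
    For illustration, here is the described 301 mapping as a tiny Python sketch. The domain names are the question's placeholders, and map_old_to_new is hypothetical; in practice a web-server rewrite rule would do this job.

      # Hypothetical sketch of the per-language 301 mapping described above.
      # Domains are the question's placeholders, not real sites.
      def map_old_to_new(path):
          """Return the new-domain URL an old-domain request should 301 to."""
          if path.startswith("/en/"):
              # English content moves to the en. subdomain, dropping /en/.
              return "https://en.new-domain.com/" + path[len("/en/"):]
          # Paths without a locale prefix are Spanish (the old default).
          return "https://es.new-domain.com" + path

      assert map_old_to_new("/en/subject") == "https://en.new-domain.com/subject"
      assert map_old_to_new("/subjecto") == "https://es.new-domain.com/subjecto"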

    Read the article

  • What's going on with my wireless?

    - by Mark Scott
    The WiFi on my Acer laptop (it's a 3810TZ, with an Intel Corporation Centrino Wireless-N 1000) works flawlessly on Ubuntu 11.04. On 11.10, it's continually up and down, and it fills the system log with messages such as those below. What is going on? It seems to be unable to decide which regulatory domain it's in. Despite the system configuration quite clearly being set to UK, it persists in configuring itself as though it were operating in Taiwan!
    Nov 22 15:34:37 MES3810 wpa_supplicant[1053]: WPA: 4-Way Handshake failed - pre-shared key may be incorrect
    Nov 22 15:34:37 MES3810 wpa_supplicant[1053]: CTRL-EVENT-DISCONNECTED bssid=00:50:7f:72:bf:b0 reason=15
    Nov 22 15:34:37 MES3810 wpa_supplicant[1053]: CTRL-EVENT-DISCONNECTED bssid=00:00:00:00:00:00 reason=3
    Nov 22 15:34:37 MES3810 kernel: [18239.240355] cfg80211: All devices are disconnected, going to restore regulatory settings
    Nov 22 15:34:37 MES3810 kernel: [18239.240362] cfg80211: Restoring regulatory settings
    Nov 22 15:34:37 MES3810 kernel: [18239.240368] cfg80211: Calling CRDA to update world regulatory domain
    Nov 22 15:34:37 MES3810 kernel: [18239.240408] wlan0: deauthenticating from 00:50:7f:72:bf:b0 by local choice (reason=3)
    Nov 22 15:34:37 MES3810 NetworkManager[875]: (wlan0): supplicant interface state: 4-way handshake - disconnected
    Nov 22 15:34:37 MES3810 kernel: [18239.246556] cfg80211: Ignoring regulatory request Set by core since the driver uses its own custom regulatory domain
    Nov 22 15:34:37 MES3810 kernel: [18239.246563] cfg80211: World regulatory domain updated:
    Nov 22 15:34:37 MES3810 kernel: [18239.246567] cfg80211: (start_freq - end_freq @ bandwidth), (max_antenna_gain, max_eirp)
    Nov 22 15:34:37 MES3810 kernel: [18239.246572] cfg80211: (2402000 KHz - 2472000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.246577] cfg80211: (2457000 KHz - 2482000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.246582] cfg80211: (2474000 KHz - 2494000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.246587] cfg80211: (5170000 KHz - 5250000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.246592] cfg80211: (5735000 KHz - 5835000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
    Nov 22 15:34:37 MES3810 NetworkManager[875]: (wlan0): supplicant interface state: disconnected - scanning
    Nov 22 15:34:37 MES3810 wpa_supplicant[1053]: Trying to authenticate with 00:50:7f:72:bf:b0 (SSID='PoplarHouse' freq=2412 MHz)
    Nov 22 15:34:37 MES3810 NetworkManager[875]: (wlan0): supplicant interface state: scanning - authenticating
    Nov 22 15:34:37 MES3810 kernel: [18239.509877] wlan0: authenticate with 00:50:7f:72:bf:b0 (try 1)
    Nov 22 15:34:37 MES3810 wpa_supplicant[1053]: Trying to associate with 00:50:7f:72:bf:b0 (SSID='PoplarHouse' freq=2412 MHz)
    Nov 22 15:34:37 MES3810 kernel: [18239.512276] wlan0: authenticated
    Nov 22 15:34:37 MES3810 kernel: [18239.512615] wlan0: associate with 00:50:7f:72:bf:b0 (try 1)
    Nov 22 15:34:37 MES3810 NetworkManager[875]: (wlan0): supplicant interface state: authenticating - associating
    Nov 22 15:34:37 MES3810 kernel: [18239.516508] wlan0: RX ReassocResp from 00:50:7f:72:bf:b0 (capab=0x431 status=0 aid=1)
    Nov 22 15:34:37 MES3810 kernel: [18239.516514] wlan0: associated
    Nov 22 15:34:37 MES3810 wpa_supplicant[1053]: Associated with 00:50:7f:72:bf:b0
    Nov 22 15:34:37 MES3810 kernel: [18239.529097] cfg80211: Calling CRDA for country: TW
    Nov 22 15:34:37 MES3810 NetworkManager[875]: (wlan0): supplicant interface state: associating - associated
    Nov 22 15:34:37 MES3810 kernel: [18239.535680] cfg80211: Updating information on frequency 2412 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535688] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535692] cfg80211: Updating information on frequency 2417 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535697] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535702] cfg80211: Updating information on frequency 2422 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535707] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535711] cfg80211: Updating information on frequency 2427 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535716] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535720] cfg80211: Updating information on frequency 2432 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535725] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535730] cfg80211: Updating information on frequency 2437 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535735] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535739] cfg80211: Updating information on frequency 2442 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535744] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535748] cfg80211: Updating information on frequency 2447 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535753] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535757] cfg80211: Updating information on frequency 2452 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535763] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535767] cfg80211: Updating information on frequency 2457 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535772] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535777] cfg80211: Updating information on frequency 2462 MHz for a 20 MHz width channel with regulatory rule:
    Nov 22 15:34:37 MES3810 kernel: [18239.535782] cfg80211: 2402000 KHz - 2472000 KHz @ KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535786] cfg80211: Disabling freq 2467 MHz
    Nov 22 15:34:37 MES3810 kernel: [18239.535789] cfg80211: Disabling freq 2472 MHz
    Nov 22 15:34:37 MES3810 kernel: [18239.535794] cfg80211: Regulatory domain changed to country: TW
    Nov 22 15:34:37 MES3810 kernel: [18239.535797] cfg80211: (start_freq - end_freq @ bandwidth), (max_antenna_gain, max_eirp)
    Nov 22 15:34:37 MES3810 kernel: [18239.535802] cfg80211: (2402000 KHz - 2472000 KHz @ 40000 KHz), (300 mBi, 2700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535807] cfg80211: (5270000 KHz - 5330000 KHz @ 40000 KHz), (300 mBi, 1700 mBm)
    Nov 22 15:34:37 MES3810 kernel: [18239.535812] cfg80211: (5735000 KHz - 5815000 KHz @ 40000 KHz), (300 mBi, 3000 mBm)
    Nov 22 15:34:38 MES3810 NetworkManager[875]: (wlan0): supplicant interface state: associated - 4-way handshake
    Any ideas?

    Read the article

  • How does btrfs RAID work in degraded mode?

    - by turbo
    My idea was that (using loopback devices) it works like this:
    1. Create the RAID array: sudo mkfs.btrfs -m raid1 -d raid1 /dev/loop1 /dev/loop2
    2. Mount it (sudo mount /dev/loop1 /mnt) and mark it: touch goodcondition
    3. Unmount and simulate disk failure (remove the disk, or delete loopback device loop2 in my case)
    4. Mount degraded (-o degraded) and mark again: touch degraded
    5. Add the bad disk again: sudo btrfs dev add /dev/loop2
    6. Rebalance: sudo btrfs fi ba /mnt
    And RAID 1 should work again. But that's not the case. sudo btrfs fi show:
    Total devices 3 FS bytes used 28.00KB
    devid 3 size 4.00GB used 264.00MB path /dev/loop1
    devid 2 size 4.00GB used 272.00MB path /dev/loop2
    *** Some devices missing
    The file degraded lives on loop1 but not on loop2 when loop2 is mounted in degraded mode. Why is that?

    Read the article

  • How do I set up multiple HDD?

    - by mark kirby
    I got some new hard disks and would like to set my PC up in the following way: Ubuntu is currently installed on one drive; I want to put Windows on a second drive; and I would like a third drive for shared content (music and stuff) that both OSes can access. What I need to know: what format should the content drive be, how should I configure the drive order in my BIOS for GRUB to be the boot manager, and how do I configure GRUB for multi-HDD booting?

    Read the article

  • Upcoming Directory Services Live Webcast - Improve Time-to-Market and Reduce Cost with Oracle Directory Services

    - by mark.wilcox
    We're doing another live webcast on May 27 - here are the details:
    Live Webcast: Improve Time-to-Market and Reduce Cost with Oracle Directory Services
    Event Date: Thursday, May 27, 2010
    Event Time: 10:00 AM Pacific / 1:00 PM Eastern
    Organizations can spend up to 60% of their IT budgets on operational activities.
    • Are you being asked to do more, with fewer resources?
    • Have you had to lead a cost-cutting exercise in your IT department?
    • Do you have licenses for software and wonder whether you are getting the most out of those resources?
    • Do you want to be an Identity Hero inside your organization?
    Oracle brings leadership in Directory Services to help organizations identify ways to leverage Oracle Virtual Directory to reduce costs in their enterprise. This presentation will explore ways to use Oracle Virtual Directory to federate faster and to create architectures that meet aggressive time constraints for identity projects or mergers and acquisitions in a cost-conscious environment.
    -- Posted via email from Virtual Identity Dialogue

    Read the article

  • Lessons From OpenId, Cardspace and Facebook Connect

    - by mark.wilcox
    (c) Denise Carbonell
    I think Johannes Ernst summarized pretty well what happened, in a broad sense, with OpenID, CardSpace, and Facebook Connect. However, I'm more interested in the lessons we can take away from this.
    First, the "Apple Lesson": if user-centric identity is going to happen, it will require not only technology but also a strong marketing campaign. I'm calling this the "Apple Lesson" because it's very similar to how the Apple iPad saw success versus the rest of the tablet market. The iPad is not only a very good technology product, but it was backed by a very good marketing plan. I know most people do not want to think about marketing here, but the fact is that nobody could really articulate why user-centric identity mattered in a way that the average person cared about.
    Second, the "Facebook Lesson": Facebook Connect solves a number of interesting problems in a way that is easy for both consumers and service providers. For a consumer, it's simple to log in without any redirects. And while Facebook isn't perfect on privacy, no other major consumer-focused service on the Internet provides as much control over sharing identity information. From a developer perspective, it is very easy to implement the SSO and fetch other identity information (if the user has given permission). This could only happen because a major company decided to make a singular focus of making it happen.
    Third, the "Developers Lesson": the Facebook Social Graph API is by far the simplest API for accessing identity information, which is another reason why you're seeing such rapid growth in Facebook-enabled websites. By using a combination of URL and JavaScript, the power a single HTML page now gives a developer writing web applications is simply amazing. For example, it doesn't get much simpler than "http://api.facebook.com/mewilcox" for accessing identity. And while I can't yet share too much publicly about the specifics, the Social Graph API had a profound impact on me in designing our next generation APIs.
    Posted via email from Virtual Identity Dialogue

    Read the article

  • Limited resource practice problems?

    - by Mark
    I'm applying to some big companies, and the area I keep getting burned on is problems involving limited memory, disk space, or throughput. These large companies process GBs of data every second (or more), and they need efficient ways of managing all that data. I have no experience with this, as none of the projects I have worked on have grown that large. Is there a good place to learn about or practice these sorts of problems? Most of the practice-problem sites I've encountered only have problems where you have to solve something efficiently (usually involving prime numbers), but none of them limit your resources.
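
    For a concrete picture of what a resource cap changes, here is a minimal sketch (Python; every name is mine, not from any practice site) of external merge sort, the classic way to sort a file that does not fit in memory: sort bounded chunks, spill each to disk, then stream a k-way merge.

      import heapq
      import itertools
      import tempfile

      def external_sort(in_path, out_path, max_lines=100_000):
          """Sort newline-terminated lines, holding at most max_lines in RAM."""
          runs = []
          with open(in_path) as src:
              while True:
                  chunk = list(itertools.islice(src, max_lines))
                  if not chunk:
                      break
                  chunk.sort()                      # in-memory sort of one chunk
                  run = tempfile.TemporaryFile("w+")
                  run.writelines(chunk)
                  run.seek(0)
                  runs.append(run)                  # one sorted run per chunk
          with open(out_path, "w") as dst:
              dst.writelines(heapq.merge(*runs))    # streaming k-way merge
          for run in runs:
              run.close()

    The same spill-then-merge idea shows up across large-scale batch processing, so it is a reasonable first technique to practice.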

    Read the article

  • Software engineering and independence

    - by Mark
    I tend to think very independently, often coming up with unconventional, sometimes unorthodox ways of solving problems. I do not like deferring to authority, such as having to code software a certain way or follow strict guidelines/formats. Do you think the software engineering/development field would be very tough for someone like me who prefers autonomy? And if it would be, what fields of computer science do allow for that?

    Read the article

  • Cool Tools You Can Use: Validation Templates for PeopleSoft Contracts Processes

    - by Mark Rosenberg
    This is the first in a series of postings we’ll be making under the heading of Cool Tools You Can Use. Our PeopleSoft product management team identified the need for this series after reflecting on the many conversations we have each year with our PeopleSoft community members. During these conversations, we were discovering that customers and implementation partners were often not aware that solutions exist to the problems they were trying to address, and that the solutions were readily available at no additional charge. Thus, the Cool Tools You Can Use series will describe the business challenge we’ve heard, the PeopleSoft solution to the challenge, and how you can learn more about the solution so that everyone can be sure to make full use of what PeopleSoft applications have to offer.
    The first cool tool we’ll look at is the Validation Template for PeopleSoft Contracts Process Requests, which was first released in December 2013 as part of PeopleSoft Contracts 9.2 Update Image 4. The business issue our customers highlighted to us is the need to tightly control, but easily configure and manage, the scope of data that any user can process when initiating a process. Control of each user’s span of impact is essential to reducing billing reconciliation issues, passing span-of-authority audits, and reducing (or even eliminating) the frequency of unexpected process results.
    Setting Up the Validation Template for a PeopleSoft Contracts Process
    With the validation template, organizations can easily and quickly ensure the software restricts the scope of transactions a user can affect, and gives organizations the confidence to know that business processes are being governed effectively. Additionally, this control of PeopleSoft Contracts process requests can be applied, and easily maintained and adjusted, from a web browser, thereby enabling analysts to administer the rules without having to engage software developers to customize the software.
    During the field validation template setup, an analyst specifies the combinations of fields that must contain values when a user tries to set up a run control and initiate a PeopleSoft Contracts process from a process request page. For example, for the Process Limits component, an organization could require that users enter a valid combination of values for the business unit, contract, and contract type fields, or a value in the contract administrator field. Until the user enters a valid combination of entries on the process request page, he cannot launch the process.
    With the validation template activated for process request pages, organizations can be confident that PeopleSoft Contracts users will not accidentally begin generating invoices or triggering other revenue management processes for transactions beyond their scope of authority. To learn more about the Validation Template, please review the Defining Validation Templates section of the PeopleSoft Contracts PeopleBooks.

    Read the article

  • .NET Libraries Cost More Than Windows?

    - by Kevin Mark
    When looking into libraries to make my programming life a little bit easier, I've (almost) always been disappointed by the prices offered. For instance, Actipro's WPF Studio is $650. I suppose that's worth it if you plan to make money from the use of those controls. But take a look at, say, Windows. Windows 7 Ultimate is just about $220. I consider Windows to be a far more complex and "worth-it" product/purchase than a library that runs on it. Why the significant difference in pricing? Do libraries really need to be so expensive, or do they need to charge more in order to make a decent sum of money?

    Read the article

  • Can't get Ubuntu to work with Windows 8

    - by John Mark High
    I've been trying to dual boot Ubuntu with Windows 8, but so far I haven't been able to. The laptop I'm using is an HP Pavilion g6-2240sa pre-installed with Windows 8. I've made the bootable USB with Ubuntu 12.10; it installs, but when I restart, the computer boots straight into Windows, with no GRUB boot options. I can get into Ubuntu once by doing an advanced restart and booting from the Ubuntu partition. I can use Ubuntu fine, but once I restart or shut down and do the advanced restart again, the Ubuntu partition is gone and I have to reinstall. I used this tutorial to install Ubuntu: http://www.youtube.com/watch?v=wNCSbTyUzoM After I had to reinstall, and still got no GRUB boot menu, I used Boot-Repair to re-install it. Once I rebooted the computer, it went straight to Windows again and the Ubuntu partition was gone. Can I dual boot Windows 8 and Ubuntu 12.10 with GRUB, so I can pick which OS to boot into when the computer starts, and without the partition going AWOL? Thanks in advance

    Read the article

  • Use Entitlements To Secure LDAP-enabled Applications With Oracle Virtual Directory and Oracle Entitlements Server

    - by mark.wilcox
    I stumbled on an interesting article that shows how the author used OVD to expose OES security to protect a portal that only understood LDAP group-based authorization. This is great because it shows how you can use OES today to build central policies that can be used without needing to rewrite all of your applications, in particular if you just want to leverage rule-based groups.
    Posted via email from Virtual Identity Dialogue

    Read the article

  • Zooming options terminology

    - by Mark
    I've come up with 4 different ways to fit an image inside a viewing region, but I'm having trouble coming up with names for them. Perhaps someone can suggest some?
    1. Fit the image in the viewing region, but do not enlarge it if the image is smaller
    2. Size the image so it fits snugly inside the viewing region, enlarging if necessary (the image is as large as possible while still fitting within the viewing region)
    3. Size the image so that it fills the entire viewing region (the image will be the same size as or bigger than the viewing region)
    4. 1:1 ratio; 1 pixel in the image corresponds to 1 pixel on screen
    All zooming options maintain aspect ratio. Stretching is just ugly, so it's not an option :)
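
    For what it's worth, all four modes reduce to picking a single scale factor, which keeps the aspect ratio automatically. A minimal Python sketch (the mode names are my own placeholders, since naming them is exactly the open question):

      def scale_factor(img_w, img_h, view_w, view_h, mode):
          fit = min(view_w / img_w, view_h / img_h)   # snug fit inside view
          fill = max(view_w / img_w, view_h / img_h)  # cover the whole view
          if mode == "fit_no_enlarge":  # option 1: fit, but never upscale
              return min(fit, 1.0)
          if mode == "fit":             # option 2: fit, upscaling if needed
              return fit
          if mode == "fill":            # option 3: fill the entire region
              return fill
          if mode == "actual_size":     # option 4: 1 image px = 1 screen px
              return 1.0
          raise ValueError(mode)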

    Read the article

  • Correctly indexing multiple domains with same content in Google and others

    - by AJweb
    I have a client with a dozen territorial domains, like mydomain.co.uk, mydomain.fr, mydomain.de, etc. Most of these domains hold a different-language version of the same dynamic content (a shop), but some, like .co.uk and .com, have the same language and content, except for some content customized to each country/domain on the front page, contact page, and other pages. I am aware that we should use the canonical meta tag to mark that duplicated content, but we want the .co.uk site to be present in the UK (indexed in google.co.uk) and the .com site to be present in the US and other countries, for example, or at least that is the goal. Is there anything we can do to "help" Google determine the geographical meaning of each domain? If we mark the .com and .co.uk sites with the canonical tag, do you know how Google will decide which one to show for a given search?

    Read the article

  • How to keep balance / Unlock items / achievement rules

    - by Mark Knol
    I'm working on an engine for a game, to learn JavaScript and just because it's fun. I'm a Flash developer; I know how to build websites. Making games is a different challenge, and JavaScript is a challenge, but I'd love to learn how to structure code and what patterns are common. I don't mind if the game never gets finished; I'm mostly interested in the programming part of it. I don't have a particular end result in mind, so I'll see where it takes me. I currently have a system where you can buy items. The items cost a specified amount of gold, silver, diamonds, etc. When you have selected and bought the item, it takes time before you get rewarded. When the time is over, you are rewarded with other properties (gold, energy, diamonds). For example, you can buy an apple for 50 gold; it takes a minute; you get rewarded with 75 energy. Or if you take a run, it costs 50 energy, takes 5 minutes, and the reward is 25 gold and 25 silver. These definitions are what I call actions. I already have a system where this works and I can define as many actions with as many properties as I want. The definitions I have kinda look like this:
    {id:101, category:544, onInit:{gold:-75}, onComplete:{energy:75}, time:2000, name:"Apple", locked: false}
    {id:102, category:544, onInit:{gold:-135}, onComplete:{energy:145}, time:2000, name:"Banana", locked: false}
    {id:106, category:302, onInit:{energy:-50, power: -25}, onComplete:{gold:100, diamonds:2}, time:10000, name:"Run", locked: false}
    {id:107, category:302, onInit:{energy:-70, silver: -55}, onComplete:{gold:100}, time:10000, name:"Dance", locked: false}
    {id:108, category:302, onInit:{energy:-230, power: -355}, onComplete:{gold:70, silver:70}, time:10000, name:"Fitness", locked: false}
    Now, I would love to add a system where I can lock/unlock the actions using achievement rules. Let's say, if you buy 10 apples, you unlock a new action, like bananas, which cost more and reward more. In the future I may want to restrict achievements and actions to levels. I am kind of stuck on how to structure this. I have two questions:
    1. Which patterns are used to define achievements? How/where are they defined? Should they be part of the action, or should there be a separate controller? Is it a good idea to register all completed actions with it? I think I want multiple types of achievement rules; I'd love to hear some ideas on how to develop it.
    2. How do you create/find a good balance, so the user does not get stuck and cannot cheat by repeating a pattern of actions to get too many rewards?
    I know there is not a simple answer and I'm lacking a good game concept, but I wonder if anyone has created such a game and how you dealt and played with it.
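
    For the structural question, here is one possible shape, sketched in Python for brevity (the rule format and all names are illustrative, not an established pattern): keep achievements in a separate controller that counts completed actions and evaluates declarative unlock rules, using the Apple/Banana ids from the definitions above.

      from collections import defaultdict

      completed = defaultdict(int)  # action id -> times completed

      unlock_rules = [
          {"requires": {101: 10}, "unlocks": 102},  # 10 Apples unlock Banana
      ]

      def on_action_complete(action_id, actions_by_id):
          """Call when an action finishes; unlocks anything whose rule is met."""
          completed[action_id] += 1
          for rule in unlock_rules:
              if all(completed[aid] >= n for aid, n in rule["requires"].items()):
                  actions_by_id[rule["unlocks"]]["locked"] = False

    Keeping the rules as data, like the action definitions themselves, means a new achievement type only needs a new rule evaluator, not changes to the actions.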

    Read the article

  • SQL Server Developer Tools – Codename Juneau vs. Red-Gate SQL Source Control

    - by Ajarn Mark Caldwell
    So how do the new SQL Server Developer Tools (previously code-named Juneau) stack up against SQL Source Control? Read on to find out.
    At the PASS Community Summit a couple of weeks ago, it was announced that the previously code-named Juneau software would be released under the name of SQL Server Developer Tools with the release of SQL Server 2012. This replacement for Database Projects in Visual Studio (also known in a former life as Data Dude) has some great new features. I won’t attempt to describe them all here, but I will applaud Microsoft for making major improvements. One of my favorite changes is the way database elements are broken down. Previously every little thing was in its own file. For example, indexes were each in their own file. I always hated that. Now, SSDT uses a pattern similar to Red-Gate’s and puts the indexes and keys into the same file as the overall table definition.
    Of course there are really cool features to keep your database model in sync with the actual source scripts, and the rename refactoring feature is now touted as being more than just a search and replace, but rather a “semantic-aware” search and replace. Funny, it reminds me of SQL Prompt’s Smart Rename feature. But I’m not writing this just to criticize Microsoft and argue that they are late to the party with this feature set. Instead, I do see it as a viable alternative for folks who want all of their source code to be version controlled, but there are a couple of key trade-offs that you need to know about when you choose which tool set to use.
    First, the basics
    Both tool sets integrate with a wide variety of source control systems including the most popular: Subversion, GIT, Vault, and Team Foundation Server. Both tools have integrated functionality to produce objects to upgrade your target database when you are ready (DACPACs in SSDT, integration with SQL Compare for SQL Source Control). If you regularly live in Visual Studio or the Business Intelligence Development Studio (BIDS) then SSDT will likely be comfortable for you. Like BIDS, SSDT is a Visual Studio Project Type that comes with SQL Server, and if you don’t already have Visual Studio installed, it will install the shell for you. If you already have Visual Studio 2010 installed, then it will just add this as an available project type. On the other hand, if you regularly live in SQL Server Management Studio (SSMS) then you will really enjoy the SQL Source Control integration from within SSMS. Both tool sets store their database model in script files. In SSDT, these are on your file system like other source files; in SQL Source Control, these are stored in the folder structure in your source control system, and you can always GET them to your file system if you want to browse them directly.
    For me, the key differentiating factors are 1) a single, unified check-in, and 2) migration scripts. How you value those two features will likely make your decision for you.
    Unified Check-In
    If you do a continuous-integration (CI) style of development that triggers an automated build with unit testing on every check-in of source code, and you use Visual Studio for the rest of your development, then you will want to really consider SSDT. Because it is just another project in Visual Studio, it can be added to your existing Solution, and you can then do a complete, or unified, single check-in of all changes whether they are application or database changes. This is simply not possible with SQL Source Control because it is in a different development tool (SSMS instead of Visual Studio) and there is no way to do one unified check-in between the two. You CAN do really fast back-to-back check-ins, but there is the possibility that the automated build that is triggered from the first check-in will cause your unit tests to fail and the CI tool to report that you broke the build. Of course, the automated build that is triggered from the second check-in, which contains the “other half” of your changes, should pass, and so the amount of time that the build was broken may be very, very short, but if that is very, very important to you, then SQL Source Control just won’t work; you’ll have to use SSDT.
    Refactoring and Migrations
    If you work on a mature system, or on a not-so-mature but also not-so-well-designed system, where you want to refactor the database schema as you go along, but you can’t have data suddenly disappearing from your target system, then you’ll probably want to go with SQL Source Control. As I wrote previously, there are a number of changes which you can make to your database that the comparison tools (both from Microsoft and Red Gate) simply cannot handle without the possibility (or probability) of data loss. Currently, SSDT only offers you the ability to inject PRE and POST custom deployment scripts. There is no way to insert your own script in the middle to override the default behavior of the tool. In version 3.0 of SQL Source Control (Early Access version now available) you have that ability to create your own custom migration script to take the place of the commands that the tool would have done, and ensure the preservation of your data. Or, even if the default tool behavior would have worked, but you simply know a better way, then you can take control and do things your way instead of theirs.
    You Decide
    In the environment I work in, our automated builds are not triggered off of check-ins, but off of the clock (currently once per night), and so there is no point at which the automated build and unit tests will be triggered without having both sides of the development effort already checked-in. Therefore having a unified check-in, while handy, is not critical for us. As for migration scripts, these are critically important to us. We do a lot of new development on systems that have already been in production for years, and it is not uncommon for us to need to do a refactoring of the database. Because of the maturity of the existing system, that often involves data migrations or other additional SQL tasks that the comparison tools just can’t detect on their own. Therefore, the ability to create a custom migration script to override the tool’s default behavior is very important to us. And so, you can see why we will continue to use Red Gate SQL Source Control for the foreseeable future.

    Read the article

  • Weird graphical errors in console and on computer shut down

    - by Mark A.
    I am all new to Ubuntu (and Linux in general) and I am experiencing some strange graphics on my screen. Console #1 (Ctrl+Alt+F1): (screenshot in the original post) Exactly the same happens on all the other consoles (2-6), and the consoles don't seem to work. And I see the same when I hibernate or shut down my computer, but not when I suspend it. I was thinking that it may have something to do with the SiS 671 video driver workaround that I use: http://ubuntuforums.org/showpost.php?p=11476910&postcount=773 Any ideas how to fix this?

    Read the article

  • Can't complete dropbox installation from behind proxy in Ubuntu 11.10

    - by Mark Jones
    Problem: My PC on campus sits behind a proxy (requiring authentication) and I can't set up Dropbox. I am convinced that this is a proxy issue, as I can't set up Ubuntu One either (but I don't use Ubuntu One, so that is not a problem). I have looked at the Ubuntu One fix, but it seems to modify settings explicitly related to Ubuntu One. I can install the nautilus-dropbox package (compiled from source, from the .deb package from the website, and from the Software Centre), but once I click OK in the "Dropbox Installation" dialog box (prompting me to download the proprietary daemon), the installation just freezes with the OK button pressed. When I look at its process in System Monitor, its waiting channel is inet_wait_for_connect. I have set the following proxy directives thus far:
    • Added mj22:**@proxy.waikato.ac.nz:80 information to network proxy settings under Network in Settings.
    • Added http_host and http_port variables under gconf-editor, system/proxy.
    • Added 'host', 'authentication_password' and 'authentication_user', and ticked 'user authentication' and 'use_http_proxy', under gconf-editor, system/http_proxy.
    • Added export http_proxy="http://mj22:**@proxy.waikato.ac.nz:80/" to /etc/bash.bashrc.
    • Added Acquire::http::proxy "http://mj22:**@proxy.waikato.ac.nz:80/"; to /etc/apt/apt.conf (which is what I imagine is letting Software Centre retrieve packages).
    (where ** is my password) I have also added the equivalent ftp and https lines for the above entries. I get the internet fine, and Software Centre can download packages, but that's it.
    Related issues: The Software Centre can't fetch reviews (but can download packages). When trying to add an online account in GNOME 3, a dialog pops up that reads "Error getting a Request Token: Cannot connect to proxy (proxy.waikato.ac.nz)".
    Updates: After some time (10 minutes or so), Dropbox shows an error dialog box that reads: "Trouble connecting to Dropbox servers. Maybe your internet connection is down, or you need to set your http_proxy environment variable." Is there a way I can see what environment variables are currently set?
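
    On that last question: running env in a terminal prints the current environment. The same check sketched in Python (purely illustrative; os.environ is the standard library's view of the environment the process inherited):

      # List environment variables, keeping only proxy-related ones.
      import os

      for name, value in sorted(os.environ.items()):
          if "proxy" in name.lower():
              print(f"{name}={value}")

    Note that a GUI application may not inherit variables set only in /etc/bash.bashrc, since that file is read by interactive shells.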

    Read the article

  • Where to store shaders

    - by Mark Ingram
    I have an OpenGL renderer which has a Scene member variable. The Scene object can contain N SceneObjects. I use these SceneObjects for storing the vertex position and any transforms. My question is, where should shaders be stored in this arrangement? I guess they need to be in a central location because multiple objects can use the same shader. But then each object needs access to the shader because it needs to set attributes into the shader. Does anyone have any advice?
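
    A hedged sketch of one common arrangement, written in Python for brevity (all class and method names are illustrative, not a prescribed API): the renderer owns a shader cache keyed by name, programs are compiled once and shared, and each SceneObject stores only the key plus its own per-object uniform values.

      class ShaderCache:
          """Owns compiled programs so many SceneObjects can share one."""
          def __init__(self, compile_fn):
              self._compile = compile_fn  # wraps the GL compile/link calls
              self._programs = {}         # shader name -> program handle

          def get(self, name):
              if name not in self._programs:
                  self._programs[name] = self._compile(name)  # compile once
              return self._programs[name]

      class SceneObject:
          def __init__(self, shader_name, uniforms):
              self.shader_name = shader_name  # key into the shared cache
              self.uniforms = uniforms        # per-object values live here

          def draw(self, cache):
              program = cache.get(self.shader_name)
              # bind program, upload self.uniforms, then issue the draw call
              return program

    This keeps ownership central (a program is never compiled per object) while still letting each object set its own attributes on the shared program at draw time.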

    Read the article

  • Is it common to prototype in a higher level language?

    - by Mark Canlas
    I'm currently toying with the idea of embarking on a project that far exceeds my current programming ability in a language I have very little real world experience in (C). Would it be valuable to prototype in a higher level language that I'm more familiar with (like Perl/Python/Ruby/C#) just so I can get the overall design going? Ultimately, the final product is performance sensitive, hence the choice of C, but I'm afraid not knowing C well will make me lose the forest for the trees. While searching for similar questions, I noticed one fellow mention that programmers used to prototype in Prolog, then crank it out in assembler.

    Read the article

  • Turn based battle and formula

    - by Mark Chapman
    I'm building a game called DVP (Digimon Virtual Pet). In this game, other than taking care of your Digimon, you can also battle and breed them. I'm working on the battle system first (because the actual pet system will be easy compared to the netplay, or 39DLL), but here is the problem: I don't want it to be "too" simple or "too" complicated, but I do want to go by a certain formula. There are str, def, spd, and int:
    Strength: how hard the attacking Digimon is hitting
    Defense: how much damage your Digimon can defend when being attacked
    Speed: the chance of you missing the enemy
    Intelligence (battle knowledge): the chance of you landing a critical hit or defending against one
    I can make a super simple turn-based example, but I don't know exactly how to make the formulas for what I've explained above. Any help?
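
    As a hedged starting point, here is one possible formula set sketched in Python. The constants and curve shapes are purely illustrative, not a balance recommendation; they just map each stat onto the role described above.

      import random

      def hit_chance(attacker_spd, defender_spd):
          # Faster attackers miss less; clamped so a hit is never certain.
          return max(0.10, min(0.95, attacker_spd / (attacker_spd + defender_spd)))

      def crit_chance(attacker_int, defender_int):
          # Battle knowledge: the int difference shifts the critical-hit odds.
          return max(0.0, min(0.50, 0.05 + (attacker_int - defender_int) / 100))

      def damage(attacker, defender):
          """One attack; attacker/defender are dicts with str/def/spd/int keys."""
          if random.random() > hit_chance(attacker["spd"], defender["spd"]):
              return 0  # missed
          base = max(1, attacker["str"] - defender["def"] // 2)
          if random.random() < crit_chance(attacker["int"], defender["int"]):
              base *= 2  # a critical hit doubles the damage
          return base

    Playing with the clamps and divisors is where the "not too simple, not too complicated" tuning happens.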

    Read the article

  • Managing Your First SharePoint Project or Team

    - by Mark Rackley
    (*editor’s note* If you have proper SharePoint training, know the difference between a site and a site collection, and have the utmost respect for the knowledge of your SharePoint team, skip this blog and go directly to meetdux.com; do not pass go, do not collect $200… otherwise, please proceed)
    Dear Mr. or Mrs. I-know-nothing-about-SharePoint-but-hey,-I-have-manager-in-my-title-so-I’ll-tell-you-how-to-do-your-job,
    Thank you so much for joining the Acme corporation. We appreciate your eagerness and willingness to jump in and help us accomplish all of our goals here at Acme (these roadrunner rockets don’t make themselves). You may have noticed that we have this thing called SharePoint lying around, and we have invested some time and money into making it not a complete piece of garbage. So I thought I’d give you some pointers to help make your stay here enjoyable and productive.
    Yeah… you don’t really know SharePoint
    Just because you had a My Site at your last organization or had a SharePoint 2003 team site does NOT mean you comprehend the vastness that is SharePoint. You don’t know what’s going on behind the scenes. You don’t know what should and should not be done. No, we CAN’T just query the SQL database directly. Yes, it really does take that long. No, we can’t do that out-of-the-box.
    Your experience doesn’t mean as much as you think it means…
    Yes, I’m aware that you co-created the internet with Al Gore and have been managing projects since I was blowing up GI Joe figures with firecrackers; however, SharePoint is not like anything you have worked with before from a management perspective. Please don’t tell us the proper way to do our job or tell us how “you” would do it, and PLEASE don’t utter the words “I used to do some .NET development, so let me know if you get stuck and need some guidance.” It MAY be possible for an incredible project manager to manage a SharePoint project and not understand the technology, but if you force your ideas on us or treat us like we don’t really know what we’re doing, then you will prove yourself NOT to be one of those types.
    Oh no you didn’t…
    Please don’t tell us how you can bring in a group of guys from Kazakhstan to do the project for $20/hr. There are many companies out there who can do some really crappy SharePoint work, and we don’t want to be stuck maintaining their junk. Do you know what it means to deploy a solution? Neither do some of those companies out there. However, there are a few AWESOME consulting firms out there, but $150/hr is cheap for these guys. Believe me, it’s worth it though. You get what you pay for!
    Show us some respect
    We truly do appreciate and value your opinion and experience, but when we tell you something is different in SharePoint, don’t be condescending and dismiss OUR experience and opinions. We have spent a lot of time and energy learning a very complicated technology that can open up a world of possibilities when used properly. We just want to make sure it is used properly. It’s not the same as .NET development. It’s not like a regular web application. There’s more going on behind the scenes than you can possibly fathom. Have a little faith in us please, and listen when we talk. You may actually learn a thing or two.
    Take some time to learn the technology
    There is hope… you don’t have to be totally worthless. Take some time to learn SharePoint. Learn what it is and what it can do. Invest some time in learning our SharePoint environment. What’s our logical architecture and taxonomy? What governance do we have in place? If you just thought “huh?” then yes, I’m talking to you.
    Sincerely,
    Your SharePoint Team
    (This rant is not pointed at any particular organization or person. If you think it’s about you, you are wrong. This is just a general rant based upon things people have told me and things I’ve seen. If you don’t think it applies to you, please move on. If you think you might be guilty of handling your SharePoint team the wrong way, then just please listen, learn, and have a little faith in your team. You all have the same goal in mind. Also, take the time to learn something about SharePoint; you will all be less frustrated with each other.)

    Read the article

  • PASS Summit 2013 Review

    - by Ajarn Mark Caldwell
    As a long-standing member of PASS who lives in the greater Seattle area and has attended about nine of these Summits, let me start out by saying how GREAT it was to go to Charlotte, North Carolina this year.  Many of the new folks that I met at the Summit this year, upon hearing that I was from Seattle, commented that I must have been disappointed to have to travel to the Summit this year after 5 years in a row in Seattle.  Well, nothing could be further from the truth.  I cheered loudly when I first heard that the 2013 Summit would be outside Seattle.  I have many fond memories of trips to Orlando, Florida and Grapevine, Texas for past Summits (missed out on Denver, unfortunately).  And there is a funny dynamic that takes place when the conference is local.  If you do as I have done the last several years and saved my company money by not getting a hotel, but rather just commuting from home, then both family and coworkers tend to act like you’re just on a normal schedule.  For example, I have a young family, and my wife and kids really wanted to still see me come home “after work”, but there are a whole lot of after-hours activities, social events, and great food to be enjoyed at the Summit each year.  Even more so if you really capitalize on the opportunities to meet face-to-face with people you either met at previous summits or have spoken to or heard of, from Twitter, blogs, and forums.  Then there is also the lovely commuting in Seattle traffic from neighboring cities rather than the convenience of just walking across the street from your hotel.  So I’m just saying, there are really nice aspects of having the conference 2500 miles away. Beyond that, the training was fantastic as usual.  The SQL Server community has many outstanding presenters and experts with deep knowledge of the tools who are extremely willing to share all of that with anyone who wants to listen.  The opening video with PASS President Bill Graziano in a NASCAR race turned dream sequence was very well done, and the keynotes, as usual, were great.  This year I was particularly impressed with how well attended were the Professional Development sessions.  Not too many years ago, those were very sparsely attended, but this year, the two that I attended were standing-room only, and these were not tiny rooms.  I would say this is a testament to both the maturity of the attendees realizing how important these topics are to career success, as well as to the ever-increasing skills of the presenters and the program committee for selecting speakers and topics that resonated with people.  If, as is usually the case, you were not able to get to every session that you wanted to because there were just too darn many good ones, I encourage you to get the recordings. Overall, it was a great time as these events always are.  It was wonderful to see old friends and make new ones, and the people of Charlotte did an awesome job hosting the event and letting their hospitality shine (extra kudos to SQLSentry for all they did with the shuttle, maps, and other event sponsorships).  We’re back in Seattle next year (it is a release year, after all) but I would say that with the success of this year’s event, I strongly encourage the Board and PASS HQ to firmly reestablish the location rotation schedule.  I’ll even go so far as to suggest standardizing on an alternating Seattle – Charlotte schedule, or something like that. If you missed the Summit this year, start saving now, and register early, so you can join us!

    Read the article
