Search Results

Search found 4449 results on 178 pages for 'david martin'.


  • Desktop shortcut to create a new desktop shortcut doesn't do anything

    - by David M. Karr
    It's weird that creating desktop shortcuts is currently so primitive. I found the following: Create Shortcut / launcher on Desktop in Ubuntu 12.04. That helps. However, if there's something wrong with the shortcut, it just doesn't do anything. For instance, I tried to create a shortcut for this command line to create a shortcut. When I double-click it, it just does nothing. This is the resulting text of my "Create Desktop Shortcut.desktop" file on the desktop:

        [Desktop Entry]
        Version=1.0
        Type=Application
        Terminal=false
        Icon[en_CA]=gnome-panel-launcher
        Exec=/usr/bin/gnome-desktop-item-edit ~/Desktop/ --create-new
        Name[en_CA]=Create Desktop Shortcut
        Name=Create Desktop Shortcut
        Icon=gnome-panel-launcher

    As I said, when I double-click this, or right-click it and select Open, nothing happens. Is there a log file where something about this would be written to?
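
    One thing worth noting (my observation, not from the original post): Exec lines in .desktop files are not run through a shell, so the ~ in "Exec=/usr/bin/gnome-desktop-item-edit ~/Desktop/ --create-new" is typically passed through literally rather than expanded; spelling out the full /home/<user>/Desktop/ path is worth a try. A few quick checks, sketched as shell commands:

        chmod +x ~/Desktop/"Create Desktop Shortcut.desktop"       # Nautilus generally only launches executable .desktop files
        /usr/bin/gnome-desktop-item-edit ~/Desktop/ --create-new   # run the Exec line by hand to see any error output
        tail -n 50 ~/.xsession-errors                              # launcher/session errors usually end up in this log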

    Read the article

  • Enabling SSL Requests on Jdev's Integrated Weblogic

    - by Christian David Straub
    Often times you will want to enable SSL access for such things as secure login or secure signup. By default, the integrated WLS that ships with JDev does not listen to SSL requests. However, this is easily fixed. Just navigate to http://127.0.0.1:7101/console. This will deploy the console app where you can configure WLS. By default the login credentials are:

        username: weblogic
        password: weblogic1

    Then go to Environment -> Servers -> DefaultServer. Check the "SSL Listen Port Enabled" box and your server will now listen to SSL requests (just make sure to use the listen port that is specified). For added security, you can always check while processing your request that it is going through an SSL connection by first checking HttpServletRequest.isSecure().
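
    As a minimal sketch of that last point (the class name and the 7102 port are assumptions, not taken from the post), a servlet can bounce plain-HTTP requests over to the SSL listen port before doing anything sensitive:

        import java.io.IOException;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        // Hypothetical servlet that insists on the SSL listen port before handling a login.
        public class SecureLoginServlet extends HttpServlet {
            private static final int SSL_PORT = 7102; // assumption: whatever SSL listen port you enabled in the console

            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                if (!req.isSecure()) {
                    // Rebuild the URL on the SSL port and send the browser there.
                    String target = "https://" + req.getServerName() + ":" + SSL_PORT + req.getRequestURI();
                    resp.sendRedirect(target);
                    return;
                }
                // ... handle the secure login/signup request here ...
            }
        }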

    Read the article

  • Desktop Apps in the Windows 8 Store?!?! That’s Impossible!!!

    - by David Paquette
    Or is it? Since Microsoft announced the Windows Store, the official word has been that desktop apps could not be distributed in the store.  But this morning I noticed this: That’s odd, but after clicking on one of these, I see that all it does is link to the website where you download Visual Studio. So really, it’s not a desktop app in the store.  It’s more of an ad for a desktop app. Interestingly enough, despite the menu being ALL CAPS, Visual Studio 2012 is getting a nearly solid 5/5 stars.

    Read the article

  • Have I fixed my partition problem with os x 10.5.8? Are my GPT and MBR back to normal?

    - by David Schaap
    I'm new to Linux and I have overstepped my abilities. I tried dual booting OS X 10.5.8 with Ubuntu 11.10 using rEFIt, but I've been having problems with partitioning. Instead of enduring more headaches, I've made the decision to simply use Ubuntu in VirtualBox. I've tried to return my HDD to normal, but I am looking for confirmation that my partitions are OK. Here is the report from partition inspector:

        *** Report for internal hard disk ***

        Current GPT partition table:
        #      Start LBA      End LBA  Type
        1         409640    233917359  Mac OS X HFS+

        Current MBR partition table:
        # A    Start LBA      End LBA  Type
        1              1    234441647  ee  EFI Protective

        MBR contents:
        Boot Code: GRUB

        Partition at LBA 409640:
        Boot Code: None
        File System: HFS Extended (HFS+)
        Listed in GPT as partition 1, type Mac OS X HFS+

    Also, my HDD directory has a bunch of extra folders in it, and they appear to be Ubuntu related, although Ubuntu is no longer installed: folders like bin, sbin, cores, var, user, and so on. Those folders aren't supposed to be there, right? Thanks in advance.
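
    For what it's worth, the same information can be re-checked from an OS X terminal; a rough sketch (disk numbers may differ on your machine, so treat these as examples):

        sudo gpt -r show /dev/disk0   # read-only dump of the GPT entries
        sudo fdisk /dev/disk0         # prints the MBR; a single 0xEE "EFI Protective" entry is the expected layout
        diskutil list                 # quick overview of the partitions as OS X sees them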

    Read the article

  • Analytics: Test events not showing up - how to troubleshoot?

    - by David Parks
    I've got 3 profiles: Master, Raw Data, and Test; on the Test profile I have no filters configured. I want to test using some events. I created a local HTML file as shown below to generate some test data that I could play with in Analytics. But the events never showed up in Analytics. I wonder what I might be doing wrong? Is the lack of a domain an issue maybe?

        <html><head></head><body>Login_popup_complete_Facebook
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-28554309-1']);
          _gaq.push(['_trackPageview']);
          _gaq.push(['_trackEvent', 'Login popup completed', 'Facebook']);

          (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
          })();
        </script>
        </body></html>
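
    One thing worth ruling out (an assumption on my part, not something stated above): classic ga.js stores its tracking cookie against a domain, so pages opened from a local file:// path frequently fail silently. When testing locally, a common workaround is to disable the domain-based cookie handling before the first tracking call:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-28554309-1']);
        _gaq.push(['_setDomainName', 'none']);   // lets the cookie be set without a real domain (local testing only)
        _gaq.push(['_trackPageview']);
        _gaq.push(['_trackEvent', 'Login popup completed', 'Facebook']);

    Standard Analytics reports can also lag by several hours, so events may simply take a while to appear.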

    Read the article

  • Push or Pull Mobile Coupons?

    - by David Dorf
    Mobile phones allow consumers to receive coupons in context, which increases their relevance and therefore redemption rates. Using your current location, you can get coupons that can be redeemed nearby for the things you want now. Receiving a coupon for something you wanted last week or something you might buy next month just isn't as valuable. I previously talked about Placecast and their concept of pushing offers to mobile phones that cross "geo-fences" around points of interest, like store locations. This push model is an automatic reminder that there are good deals just up ahead. This model works well in dense cities where people walk, but I question how effective it will be in the suburbs where people are driving. McDonald's recently ran a campaign in Finland where they pushed offers to GPS devices when cars neared their restaurants. Amazingly, they achieved a 7% click-through rate. But 8coupons.com sees things differently. They prefer the pull model that requires customers to initiate a search for nearby coupons, and they've done some studies to better understand what "nearby" means. It turns out that there are concentric search circles that emanate from your home and work. From inner to outer, people search for food, drink, shopping, and entertainment. Intuitively, that feels about right. So the question is, do consumers prefer the push or pull model for offers? No doubt the market is big enough for both. These days it's not good enough to just know who your customers are -- you also need to know where they are so you can catch them in the right moment. According to Borrell Associates, redemption rates of mobile coupons are 10x those of traditional mail and newspaper coupons. One thing is for sure: assuming 85% of consumers regularly spend money within 5 miles of home and work, location-based coupons make tons of sense.

    Read the article

  • 5 Ways to Determine Mobile Location

    - by David Dorf
    In my previous post, I mentioned the importance of determining the location of a consumer using their mobile phone.  Retailers can track anonymous mobile phones to determine traffic patterns both inside and outside their stores.  And with consumers' permission, retailers can send location-aware offers to mobile phones; for example, a coupon for cereal as you walk down that aisle.  When paying with Square, your location is matched with the transaction.  So there are lots of reasons for retailers to want to know the location of their customers.  But how is it done? I thought I'd dive a little deeper on that topic and consider the approaches to determining location.

    1. Tower Triangulation
    By comparing the relative signal strength from multiple antenna towers, a general location of a phone can be roughly determined to an accuracy of 200-1000 meters.  The more towers involved, the more accurate the location.

    2. GPS
    Using Global Positioning Satellites is more accurate than using cell towers, but it takes longer to find the satellites, it uses more battery, and it won't work well indoors.  For geo-fencing applications, like those provided by Placecast and Digby, cell towers are often used to determine if the consumer is nearing a "fence," and then the app switches to GPS to determine the actual crossing of the fence.

    3. WiFi Triangulation
    WiFi triangulation is usually more accurate than using towers just because there are so many more WiFi access points (i.e. radios in routers) around. The position of each WiFi AP needs to be recorded in a database and used in the calculations, which is what Skyhook has been doing since 2008.  Another advantage to this method is that it works well indoors, although it usually requires additional WiFi beacons to get the accuracy down to 5-10 meters.  Companies like ZuluTime, Aisle411, and PointInside have been perfecting this approach for retailers like Meijer, Walgreens, and HomeDepot. Keep in mind that a mobile phone doesn't have to connect to the WiFi network in order for it to be located.  The WiFi radio in the phone only needs to be on.  Even when not connected, WiFi radios talk to each other to prepare for a possible connection.

    4. Hybrid Approaches
    Naturally the most accurate approach is to combine the approaches described above.  The more available data points, the greater the accuracy.  Companies like ShopKick like to add in acoustic triangulation using the phone's microphone, and NearBuy can use video analytics to increase accuracy.

    5. Magnetic Fields
    The latest approach, and this one is really new, takes a page from the animal kingdom.  As you've probably learned from guys like Marlin Perkins, some animals use the Earth's magnetic fields to navigate.  By recording magnetic variations within a store, then matching those readings with ones from a consumer's phone, location can be accurately determined.  At least that's the approach IndoorAtlas is taking, and the science seems to bear it out.  It works well indoors, and doesn't require retailers to purchase any additional hardware.  Keep an eye on this one.

    Read the article

  • How to get the revision history of a branch with bzrlib

    - by David Planella
    I'm trying to get a list of committers to a bzr branch. I know I can get it through the command line with something along these lines: bzr log -n0 | grep committer | sed -e 's/^[[:space:]]*committer: //' | uniq However, I'd like to get that list programmatically with bzrlib. After having looked at the bzrlib documentation, I can't manage to find out how I would even get the full list of revisions from my branch. Any hints on how to get the full history of revisions from a branch with bzrlib, or ultimately, the list of committers?
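
    Something along these lines should work with bzrlib, though treat it as an untested sketch: revision_history() only walks the branch's mainline (unlike bzr log -n0, which includes merged revisions), and bzrlib itself is Python 2.

        from bzrlib.branch import Branch

        branch = Branch.open('.')   # path or URL of the branch
        branch.lock_read()
        try:
            committers = set()
            for revision_id in branch.revision_history():             # mainline revisions, oldest first
                revision = branch.repository.get_revision(revision_id)
                committers.add(revision.committer)                    # e.g. "Jane Doe <jane@example.com>"
        finally:
            branch.unlock()

        for committer in sorted(committers):
            print committer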

    Read the article

  • Retail CEO Interviews

    - by David Dorf
    Businessweek's 2012 Interview Issue has interviews with three retail CEOs that are worth a quick read.  I copied some excerpts below, but please follow the links to the entire interviews.

    Ron Johnson, CEO JCPenney
    Take me through your merchandising. One of the things I learned from Steve [Jobs]—Steve said three times in his life he had the chance to be part of the change of an interface. If you change the interface, you can dramatically change the entire experience of the product. For Steve, that was the mouse, the scroll wheel on the iPod, and then the [touch]screen. What we’re trying to do here is change the interface of retail. What we call that is the street, and you’re standing in the middle of it. When you walk into a store today, you’re overwhelmed by merchandise. There is a narrow aisle. Typically, it’s filled with product on tables and you’re overwhelmed with the noise of signs and promotions. Especially in the age of the Internet, the idea of going to a very large store and having so much abundance is actually not very appealing. The first thing you find here is you’re inspired. I have used the mannequins. The street is actually this new navigation path for a retail store. So if you come in here—you’ll notice that these aisles are 14 feet wide. These are wider than Nordstrom’s (JWN). Slide show of JCPenney store.

    Walter Robb, co-CEO Whole Foods
    What did you learn from the recent recession about selling groceries? It was a lot of humble pie, because our sales experienced a drop that I have never seen in 32 years of retail. Customers left us in droves. We also learned that there were some very loyal customers who loved Whole Foods (WFM), people who said, “I like what you stand for. I like coming here. I like this experience.” That was very affirming. I think the realization was that we’ve got some customers, and we need to make sure we know who they are. So instead of chasing every customer out there, we started doing customer discussion groups. We were growing for growth’s sake, which is not a good strategy. We were chasing the rainbow. We cut the growth in half overnight and said, “All right, slow down. Let’s make sure we’re doing this better and more thoroughly and more thoughtfully.” This company is a mission-based company. This company started to change the world by bringing healthier food to the world. It’s not about the money, it’s about the impact, and this company is back on track as a result of those experiences. Video of Whole Foods store tour.

    Kay Krill, CEO Ann Taylor
    You’ve worked in retail all your life. What drew you to it? I graduated from college, and I did not know what I wanted to do. Macy’s (M) came to campus to interview for their training program, and I thought, “Let me give it a try.” I got the job and fell in love with the industry. The president of Macy’s at the time said, “If you don’t wake up every morning dying to go to work, then retailing is not for you; it has to be in your blood.” It was in my blood. I love the fact that every day is different. You can get to be creative one day, financial the next day, marketing the next. I love going to stores. I love talking to associates. I love talking to clients. There’s not a predictable day.

    Read the article

  • Loading sound in XNA without the Content Pipeline

    - by David Gouveia
    I'm working on a "Game Maker"-type of application for Windows where the user imports his own assets to be used in the game. I need to be able to load this content at runtime on the engine side. However I don't want the user to have to install anything more than the XNA runtime, so calling the content pipeline at runtime is out. For images I'm doing fine using Texture2D.FromStream. I've also noticed that XNA 4.0 added a FromStream method to the SoundEffect class but it only accepts PCM wave files. I'd like to support more than wave files though, at least MP3. Any recommendations? Perhaps some C# library that would do the decoding to PCM wave format.
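
    One possible route (an assumption, not something I have verified against this project) is to lean on a third-party decoder such as NAudio, which is Windows-only but fits the "Game Maker for Windows" scenario: decode the MP3 to 16-bit PCM and hand the raw bytes to the SoundEffect(byte[], int, AudioChannels) constructor instead of FromStream.

        using System.IO;
        using Microsoft.Xna.Framework.Audio;
        using NAudio.Wave;   // assumption: the NAudio library is referenced by the engine

        public static class Mp3Loader
        {
            // Hypothetical helper: fully decodes the file up front, so it suits short effects more than long music tracks.
            public static SoundEffect LoadMp3(string path)
            {
                using (var mp3 = new Mp3FileReader(path))
                using (var pcm = WaveFormatConversionStream.CreatePcmStream(mp3))
                using (var buffer = new MemoryStream())
                {
                    pcm.CopyTo(buffer);   // raw 16-bit PCM samples
                    var channels = pcm.WaveFormat.Channels == 2 ? AudioChannels.Stereo : AudioChannels.Mono;
                    return new SoundEffect(buffer.ToArray(), pcm.WaveFormat.SampleRate, channels);
                }
            }
        }

    For long tracks, streaming decoded chunks into a DynamicSoundEffectInstance would avoid holding the whole file in memory.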

    Read the article

  • iPad Jailbreak – On The Lam In A Single Day

    - by David Totzke
    Exploits to jailbreak the iPhone are well known.  The iPad runs on the iPhone 3.2 firmware.  What this means is that the iPad was shipped with known security vulnerabilities that would allow someone to gain root access to the device. Nice. It’s not like these are security vulnerabilities that are known but have no exploits.  The exploits are numerous and freely available. Of course, if you fit the demographic, you probably have nothing to worry about. Magical and Revolutionary?  Hardly. Dave Just because I can…

    Read the article

  • ODI 12c - Getting up and running fast

    - by David Allan
    Here's a quick A-B-C to show you how to quickly get up and running with ODI 12c, from getting the software to creating a repository via wizard or the command line, then installing an agent for running load plans and the like.

    A. Get the software from OTN and install studio. Check out this viewlet here for quickly doing this.

    B. Create a repository using the RCU; check out this viewlet here which uses the FMW Repository Creation Utility.  You can also silently create (and drop) a repository using the command line, this is really easy.

        .\rcu -silent -createRepository -connectString yourhost:1521:orcl.st-users.us.oracle.com -dbUser sys -dbRole sysdba -useSamePasswordForAllSchemaUsers true -schemaPrefix X -component ODI -component IAU -component IAU_APPEND -component IAU_VIEWER -component OPSS < passwords.txt

    where the passwords file contains info such as:

        sysdba_passwd
        newschema_passwd
        odi_user_passwd
        D
        workreposname
        workrepos_passwd

    You can find details about the silent use of RCU here in the FMW documentation.

    C. Quickly create an agent for executing load plans and the like - there is a great OBE for this, check it out here. If you are on your laptop and just wanting as minimal an agent as possible then this link is a must.

    With these three steps you are ready to get to the fun stuff! Check out more OBEs here - keep on the lookout for more!

    Read the article

  • Adjust resolution in xfce4 virtualbox guest guest

    - by David
    I have Virtualbox 4.1.2_ubuntur3859 installed on an Ubuntu 11.10 host, running a guest ubuntu server 10.04 with xfce4 and xorg installed with no-install-recommends. I have installed guest additions, but the maximum resolution in the display settings is 800x600. I have read related questions: How to change resolution of the VirtualBox (Ubuntu guest and host)? Higher screen resolution in VirtualBox? upgrading VirtualBox 3.2.10 breaks my guest Ubuntu screen resolution Ubuntu as guest OS (with Vista host) stuck at 800x600 resolution but none contain the solution to my issue. Am I missing any particular packages that would allow me to change resolution? I would like to keep the machine as small as possible.
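
    Two things that are often worth trying in this situation (sketched below with placeholder names and sizes, so adjust to taste): registering an extra video mode with VBoxManage on the host, and then selecting it with xrandr inside the guest.

        # On the host, with the VM powered off ("ServerVM" and the mode are placeholders):
        VBoxManage setextradata "ServerVM" CustomVideoMode1 1280x800x32

        # Inside the guest:
        xrandr                 # list the outputs and modes the X server currently offers
        xrandr -s 1280x800     # switch to the new mode if it now shows up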

    Read the article

  • How to impale and stack targets correctly according to the collider and its coordinate?

    - by David Dimalanta
    I'm making another simple game, a catch game, where a spawning target game object must be captured using a skewer to impale it. Here's how:

    • At the start, the falling object (in red) will fall in a vertical direction (in blue).
    • When aimed properly, the target will fall down along the line of the skewer (in blue).
    • Then, another target is spawned and will fall vertically (in red).
    • When aimed successfully again in a streak, the second target will fall along the skewer and stack.
    • The same process repeats whenever another target is spawned.

    However, when I test run it in the Scene tab in Unity, after impaling several targets, instead of flowing smoothly and stacking they end up overlapping instead of stacking up like a pancake. Here's what it looks like: Around the halfway point of my progress, I tried to figure out how to deal with the collider bodies without them sticking to each other, so that they actually stack like in the example of the image at no. 3. Here's the script code I added to the target game object:

        using UnityEngine;
        using System.Collections;

        public class ImpaleStateTest : MonoBehaviour {

            public GameObject target;
            public GameObject skewer;
            public bool drag = false;

            private float stack;
            private float impaleCount;

            void Start () {
                stack = 0;
                impaleCount = 0;
            }

            void Update () {
                if(drag) {
                    target.transform.position = new Vector3 (DragTest.dir.transform.position.x, DragTest.dir.transform.position.y - 0.35f, 0);
                    target.transform.rotation = DragTest.degrees;
                    target.rigidbody2D.fixedAngle = true;
                    target.rigidbody2D.isKinematic = true;
                    target.rigidbody2D.gravityScale = 0;
                    if(Input.GetMouseButton(0)) {
                        Debug.Log ("Skewer: " + DragTest.dir.transform.position.x);
                        Debug.Log ("Target: " + target.transform.position.x);
                    }
                }
            }

            void OnTriggerEnter2D(Collider2D collider) {
                impaleCount++;
                Debug.Log ("Impaled " + impaleCount + " time(s)!");
                drag = true;
                audio.Play ();
            }
        }

    Aside from that, I'm not sure if it's right, but the only way to keep the impaled targets stuck while dragging the skewer left or right is to take the X coordinate from the skewer only. Is there something else you would recommend in order to make this behavior as realistic as possible? Please help.
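
    One way to keep the stack from collapsing onto a single point (a sketch under assumptions, written as if the script lived on the skewer so the incoming collider is the freshly impaled target) is to parent each target to the skewer and give every catch its own offset along the skewer, instead of copying the skewer's X every frame:

        using UnityEngine;

        // Hypothetical variant: each impaled target becomes a child of the skewer and gets its own "slot".
        public class SkewerStack : MonoBehaviour
        {
            public float slotSpacing = 0.35f;   // assumption: roughly one target's height
            private int impaleCount = 0;

            void OnTriggerEnter2D(Collider2D other)
            {
                impaleCount++;

                other.rigidbody2D.isKinematic = true;   // stop the physics engine from fighting the stack
                other.transform.parent = transform;     // the target now follows the skewer automatically

                // Each catch sits one slot further along the skewer, so the pieces stack like a kebab.
                other.transform.localPosition = new Vector3(0f, -impaleCount * slotSpacing, 0f);
                other.transform.localRotation = Quaternion.identity;
            }
        }

    Because the targets are children of the skewer's transform, dragging or rotating the skewer moves the whole stack without copying coordinates by hand.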

    Read the article

  • How do I get google to see keywords on a one page web application site?

    - by David
    I'm going to have to link to the web site to explain this, http://www.diagram.ly; it's a free service, so I hope this doesn't break advertising rules. Basically, it's a one-page web application, and I don't want to create a web site for it. Some background text loads and, if JavaScript is enabled, the web application itself then loads. The problem is that Google only seems to be picking up the title of the page and the text in the footer, so the site only appears in Google searches for very limited text (based on the title and meta description mostly). I was hoping that search engines would pick up on the background text and index that. The text is factual, not keyword stuffed. Yahoo seems to pick up the text, just not Google. Does anyone have any experience of how Google would view such a site and where I could put the text for a better result? Edit: I should mention that Google Webmaster Tools lists the site keywords as "Component, diagramly, feed, mxgraph, share and twitter". Basically the footer and little else.
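
    One low-tech option (my suggestion, not something Google documents for this site specifically) is to keep a small block of real, crawlable HTML in the page itself rather than injecting the background text with JavaScript, for example inside a noscript element:

        <!-- Hypothetical static block: crawlers that do not execute JavaScript still see descriptive text. -->
        <noscript>
          <h1>Diagramly - free online diagram editor</h1>
          <p>Draw flowcharts and other diagrams in the browser and export them as images.</p>
        </noscript>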

    Read the article

  • Transparent Technology from Amazon

    - by David Dorf
    Amazon has been making some interesting moves again, this time in the augmented humanity area.  Augmented humanity is about helping humans overcome their shortcomings using technology.  Putting a powerful smartphone in your pocket helps you in many ways like navigating streets, communicating with far off friends, and accessing information.  But the interface for smartphones is somewhat limiting and unnatural, so companies have been looking for ways to make the technology more transparent and therefore easier to use. When Apple helped us drop the stylus, we took a giant leap forward in simplicity.  Using touchscreens with intuitive gestures was part of the iPhone's original appeal.  People don't want to know that technology is there -- they just want the benefits.  So what's the next leap beyond the touchscreen to make smartphones even easier to use? Two natural ways we interact with the world around us are by using sight and voice.  Google and Apple have been using both in their mobile platforms for limited use cases.  Nobody actually wants to type a text message, so why not just speak it?  And if you want more information about a book, why not just snap a picture of the cover?  That's much more accurate than trying to key the title and/or author. So what's Amazon been doing?  First, Amazon released a new iPhone app called Flow that allows iPhone users to see information about products in context.  Yes, it's an augmented reality app that uses the phone's camera to view products, and overlays data about the products on the screen.  For the most part it requires the barcode to be visible to correctly identify the product, but I believe it can also recognize certain logos as well.  Download the app and try it out but don't expect perfection.  It's good enough to demonstrate the concept, but it's far from accurate enough.  (MobileBeat did a pretty good review.)  Extrapolate to the future and we might just have a heads-up display in our eyeglasses. The second interesting area is voice response, for which Siri is getting lots of attention.  Amazon may have purchased a voice recognition company called Yap, although the deal is not confirmed.  But it would make perfect sense, especially with the Kindle Fire in Amazon's lineup. I believe over the next 3-5 years the way in which we interact with smartphones will mature, and they will become more transparent yet more important to our daily lives.  This will, of course, impact the way we shop, making information more readily accessible than it already is.  Amazon seems to be positioning itself to be at the forefront of this trend, so we should be watching them carefully.

    Read the article

  • How to Handle frame rates and synchronizing screen repaints

    - by David Kroukamp
    I would first off like to say sorry if the title is worded incorrectly. Okay, let me give the scenario: I'm creating a 2-player fighting game. An average battle will include a map (moving or still) and 2 characters (which are rendered by redrawing a varying number of sprites one after the other). At the moment I have a single game loop limiting me to a set number of frames per second (using Java):

        Timer timer = new Timer(0, new AbstractAction() {
            @Override
            public void actionPerformed(ActionEvent e) {
                long beginTime;  // The time when the cycle begun
                long timeDiff;   // The time it took for the cycle to execute
                int sleepTime;   // ms to sleep (< 0 if we're behind)
                int fps = 1000 / 40;

                beginTime = System.nanoTime() / 1000000;

                // execute loop to update, check collisions and draw
                gameLoop();

                // Calculate how long the cycle took
                timeDiff = System.nanoTime() / 1000000 - beginTime;

                // Calculate sleep time
                sleepTime = fps - (int) (timeDiff);

                if (sleepTime > 0) { // If sleepTime > 0 we're OK
                    ((Timer) e.getSource()).setDelay(sleepTime);
                }
            }
        });
        timer.start();

    In gameLoop() the characters are drawn to the screen (a character holds an array of images which make up its current sprites); every gameLoop() call changes the character's current sprite to the next one and loops back when the end is reached. But as you can imagine, if an animation is only 3 images long, then calling gameLoop() 40 times will cycle through the character's movement 40/3 = 13 times. This causes a few minor anomalies in the sprites for some characters. So my question is: how would I go about delivering a set number of frames per second when I have 2 characters on screen with varying numbers of sprites?
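
    One common fix (a sketch, not tied to the exact classes above) is to stop advancing the sprite once per tick and instead give each animation its own frame duration, accumulating elapsed time in the game loop and only stepping the sprite when enough time has passed:

        // Minimal sketch: each character animation advances on its own clock,
        // independent of how many times per second gameLoop() runs.
        public class SpriteAnimation {
            private final int frameCount;         // e.g. 3 images for a short move
            private final long frameDurationMs;   // how long each image should stay on screen
            private long elapsedMs = 0;
            private int currentFrame = 0;

            public SpriteAnimation(int frameCount, long frameDurationMs) {
                this.frameCount = frameCount;
                this.frameDurationMs = frameDurationMs;
            }

            /** Call once per game tick with the time since the previous tick. */
            public void update(long deltaMs) {
                elapsedMs += deltaMs;
                while (elapsedMs >= frameDurationMs) {
                    elapsedMs -= frameDurationMs;
                    currentFrame = (currentFrame + 1) % frameCount;   // wrap back to the first image
                }
            }

            public int getCurrentFrame() {
                return currentFrame;
            }
        }

    The loop can then keep ticking 40 times a second while a 3-image move is told to show, say, one image every 100 ms, so characters with different sprite counts all animate at sensible speeds.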

    Read the article

  • Building Publishing Pages in Code

    - by David Jacobus
    Originally posted on: http://geekswithblogs.net/djacobus/archive/2013/10/27/154478.aspx

    One of the mantras we developers try to follow: ensure that the solution package we deliver to the client is complete. We build Web Parts, Master Pages, Images, CSS files and other artifacts that we push to the client with a WSP (Solution Package), and then we have them finish the solution by building their site pages, adding the web parts to the site pages. I am a proponent that we, the developers, should minimize this time-consuming work and build these site pages in code. I found a few blogs and some MSDN documentation, but not really a complete solution that has all these artifacts working together. What I will discuss and provide a solution for is a package that has:

    1.  Master Page
    2.  Page Layout
    3.  Page Web Parts
    4.  Site Pages

    Most all done in code, without the development team or the developers having to finish up the site-building process by spending a few hours or days completing the site! I am not implying that in development we do this. In fact, we build these pages incrementally, testing our web parts, etc. I am saying that the final action in our solution is that we take all these artifacts and add them to the site pages in code; the client then only needs to activate a few features and VOILA, their site appears! I had a project that had me build 8 pages like this as part of the solution.

    In this blog post, I am taking a master page solution that I have called DJGreenMaster. On my Office 365 development site it looks like this: a generic master page for a SharePoint 2010 site, along with a three-column layout, centered, with a footer that uses a SharePoint List and Web Part for the footer links. I use this master page a lot in my site development! It is easy to change the color and site logo with a little CSS. I am going to add a few web parts for discussion purposes and then add these web parts to a site page in code. Let's look at the solution package for DJGreenMaster as that will be the basis project for building the site pages. What you are seeing is a complete solution to add a Master Page to a site collection, which contains:

    1.  A Master Page Module which contains the Master Page and Page Layout
    2.  The Footer Module to add the Footer Web Part
    3.  Miscellaneous modules to add images, JQuery, CSS and a subsite page
    4.  3 features and two feature event receivers:
        a.  DJGreenCSS, used to add the master page CSS file to the Style Sheet Library, with an Event Receiver to check it in.
        b.  DJGreenMaster, used to add the Master Page and Page Layout. In an Event Receiver, change the master page to DJGreenMaster, create the footer list and check the files in.
        c.  DJGreenMasterWebParts, which adds the Footer Web Part to the site collection.

    I won't go over the code for this as I will give it to you at the end of this blog post. I have discussed creating a list in code in a previous post. So what we have is the basis to begin what is germane to this discussion. I have the first two requirements completed; I now need to add the page web parts and build the pages in code. For the page web parts, I will use one downloaded from Codeplex which does not use a SharePoint custom list, for simplicity (a Weather Web Part), and another downloaded from MSDN which is a SharePoint Custom Calendar Web Part, to which I had to add some functionality (using JQuery) to make the events color-coded and exceed the built-in 10 overlays.
    Here is the solution with the added projects. (Screen shots of the Weather Web Part deployed and of the Site Calendar with JQuery accompanied the original post.) Okay, now we get to the final item: creating the publishing pages. We need to add a feature receiver to the DJGreenMaster project; I will name it DJSitePages and also add an Event Receiver. We will build the page at the site collection level, and all of the code necessary will be contained in the event receiver. Add a reference to the Microsoft.SharePoint.Publishing.dll contained in the ISAPI folder of the 14 hive. First we will add some static methods which we will call in our Event Receiver:

        private static void checkOut(string pagename, PublishingPage p)
        {
            if (p.Name.Equals(pagename, StringComparison.InvariantCultureIgnoreCase))
            {
                if (p.ListItem.File.CheckOutType == SPFile.SPCheckOutType.None)
                {
                    p.CheckOut();
                }

                if (p.ListItem.File.CheckOutType == SPFile.SPCheckOutType.Online)
                {
                    p.CheckIn("initial");
                    p.CheckOut();
                }
            }
        }

        private static void checkin(PublishingPage p, PublishingWeb pw)
        {
            SPFile publishFile = p.ListItem.File;

            if (publishFile.CheckOutType != SPFile.SPCheckOutType.None)
            {
                publishFile.CheckIn("CheckedIn");
                publishFile.Publish("published");
            }
            // In case of content approval, approve the file (publishing site)
            if (pw.PagesList.EnableModeration)
            {
                publishFile.Approve("Initial");
            }
            publishFile.Update();
        }

    In a Publishing Site, CheckIn and CheckOut are required when dealing with pages. Okay, let's look at the Feature Activated Event Receiver:

        public override void FeatureActivated(SPFeatureReceiverProperties properties)
        {
            object oParent = properties.Feature.Parent;

            if (properties.Feature.Parent is SPWeb)
            {
                currentWeb = (SPWeb)oParent;
                currentSite = currentWeb.Site;
            }
            else
            {
                currentSite = (SPSite)oParent;
                currentWeb = currentSite.RootWeb;
            }

            //create the publishing pages
            CreatePublishingPage(currentWeb, "Home.aspx", "ThreeColumnLayout.aspx", "Home");
            //CreatePublishingPage(currentWeb, "Dummy.aspx", "ThreeColumnLayout.aspx", "Dummy");
        }

    Basically we are calling the method CreatePublishingPage with parameters: the current web, the name of the page, the page layout, and the title of the page.
    Let’s look at the CreatePublishingPage method:

        private void CreatePublishingPage(SPWeb site, string pageName, string pageLayoutName, string title)
        {
            PublishingSite pubSiteCollection = new PublishingSite(site.Site);
            PublishingWeb pubSite = null;
            if (pubSiteCollection != null)
            {
                // Assign an object to the pubSite variable
                if (PublishingWeb.IsPublishingWeb(site))
                {
                    pubSite = PublishingWeb.GetPublishingWeb(site);
                }
            }
            // Search for the page layout for creating the new page
            PageLayout currentPageLayout = FindPageLayout(pubSiteCollection, pageLayoutName);
            // Check whether the Page Layout could be found in the collection;
            // if not (== null), return because the page has to be based on
            // an existing Page Layout
            if (currentPageLayout == null)
            {
                return;
            }

            PublishingPageCollection pages = pubSite.GetPublishingPages();
            foreach (PublishingPage p in pages)
            {
                // The page already exists
                if ((p.Name == pageName)) return;
            }

            PublishingPage newPage = pages.Add(pageName, currentPageLayout);
            newPage.Description = pageName.Replace(".aspx", "");
            // Here you can set some properties like:
            newPage.IncludeInCurrentNavigation = true;
            newPage.IncludeInGlobalNavigation = true;
            newPage.Title = title;

            // build the page
            switch (pageName)
            {
                case "Homer.aspx":
                    checkOut("Courier.aspx", newPage);
                    BuildHomePage(site, newPage);
                    break;

                default:
                    break;
            }
            // newPage.Update();
            // Now we can check in the newly created page to the "Pages" library
            checkin(newPage, pubSite);
        }

    The narrative of what is going on here is:

    1.  We need to find out if we are dealing with a Publishing Web.
    2.  Get the Page Layout.
    3.  Create the Page in the Pages list.
    4.  Based on the page name, we build that page. (Here is where we can add all the methods to build multiple pages.)

    In the switch we call BuildHomePage, where all the work is done to add the web parts. Prior to adding the web parts we need to add references to the two web part projects in the solution:

        using WeatherWebPart.WeatherWebPart;
        using CSSharePointCustomCalendar.CustomCalendarWebPart;

    We can then reference them in the BuildHomePage method.
    Let’s look at BuildHomePage:

        private static void BuildHomePage(SPWeb web, PublishingPage pubPage)
        {
            // build the pages
            // Get the web part manager for each page and do the same code as below
            // (copy and paste, change to the web parts for the page)
            // Part Description
            SPLimitedWebPartManager mgr = web.GetLimitedWebPartManager(web.Url + "/Pages/Home.aspx",
                System.Web.UI.WebControls.WebParts.PersonalizationScope.Shared);

            WeatherWebPart.WeatherWebPart.WeatherWebPart wwp = new WeatherWebPart.WeatherWebPart.WeatherWebPart()
                { ChromeType = PartChromeType.None, Title = "Todays Weather", AreaCode = "2504627" };
            //Dictionary<string, string> wwpDic = new Dictionary<string, string>();
            //wwpDic.Add("AreaCode", "2504627");
            //setWebPartProperties(wwp, "WeatherWebPart", wwpDic);

            // Add the web part to a page layout Web Part Zone
            mgr.AddWebPart(wwp, "g_685594D193AA4BBFABEF2FB0C8A6C1DD", 1);

            CSSharePointCustomCalendar.CustomCalendarWebPart.CustomCalendarWebPart cwp = new CustomCalendarWebPart()
                { ChromeType = PartChromeType.None, Title = "Corporate Calendar", listName = "CorporateCalendar" };

            mgr.AddWebPart(cwp, "g_20CBAA1DF45949CDA5D351350462E4C6", 1);

            pubPage.Update();
        }

    Here is what we are doing:

    1.  We get a reference to the SharePoint Limited Web Part Manager, pointed at Home.aspx.
    2.  We instantiate a new Weather Web Part and use the manager to add it to the page in a web part zone identified by ID (thus the need for a Page Layout where the developer knows the IDs).
    3.  We instantiate the Calendar Web Part and use the manager to add it to the page.
    4.  We then call the Publishing Page Update method.
    5.  Lastly, the CreatePublishingPage method checks in the page just created.

    Here is a screen shot of the page right after a deploy! Okay, I know we could make a home page look much better! However, I built this whole integrated solution in less than a day, with the caveat that the Green Master was already built! So what am I saying? Build your web parts, master pages, etc.; at the very end of the engagement, build the pages. The client will be very happy! Here is the code for this solution: Code

    Read the article

  • What are the main programmers web sites in other countries?

    - by David
    Looking at the demographics of sites like this one, there seems to be a very heavy US weighting of users. I know there have been discussions about using English as a programmer, but there are some large countries with large programmer bases that must be using localized sites. Countries like Brazil and China in particular: what sites do programmers go to for discussions? Do a lot of the threads link up to English-speaking sites like these? In France they obviously prefer their own language; what do the equivalent sites look like? I don't want to turn this into an English-language discussion like this one; it's more a general trans-cultural programming curiosity.

    Read the article

  • MDM for Tax Authorities

    - by david.butler(at)oracle.com
    In last week’s MDM blog, we discussed MDM in the Public Sector. I want to continue that thread. After all, no industry faces tougher data quality problems than governmental organizations, and few industries suffer more significant down side consequences to poor operations than local, state and federal governments. One key challenge area is taxation. Tax Authorities face a multitude of IT challenges. Firstly, the data used in tax calculations is increasing in volume and complexity. They must improve service by introducing multi-channel contact centers and self-service capabilities. Security concerns necessitate increasingly sophisticated data protection procedures. And cost constraints are driving Tax Authorities to rely on off-the-shelf software for many of their functional areas. Compounding these issues is the fact that the IT architectures in operation at most revenue and collections agencies are very complex. They typically include multiple, disparate operational and analytical systems across which the sum total of data about individual constituents is fragmented. To make matters more complicated, taxation is not carried out by a single jurisdiction, and often sources of income including employers, investments and other sources of taxable income and deductions must also be tracked and shared among tax authorities. Collectively, these systems are involved in tax assessment and collections, risk analysis, scoring, tracking, auditing and investigation case management. The Problem of Constituent Data Management The infrastructure described above makes it very difficult to create a consolidated representation of a given party. Differing formats and data models mean that a constituent may be represented in one way in one system and in a different way in another. Individual records are frequently inaccurate, incomplete, out of date and/or inconsistent with other records relating to the same constituent. When constituent data must be aggregated and scored, information within each system must be rationalized and normalized so the agency can produce a constituent information file (CIF) that provides a single source of truth about that party. If information about that constituent changes, each system in turn must be updated. There have been many attempts to solve this problem with technology: from consolidating transactional systems to conducting manual systems integration projects and superimposing layers of business intelligence and analytics. All these approaches can be successful in solving a portion of the problem at a specific point in time, but without an enterprise perspective, anything gained is quickly lost again. Oracle Constituent Data Mastering for Tax Authorities: A Single View of the Constituent Oracle has a flexible and long-term solution to the problem of securely integrating and managing constituent data. The Oracle Solution for mastering Constituent Data for Tax Authorities is based on two core product offerings: Oracle Customer Hub and – optionally – Oracle Application Integration Architecture (AIA). Customer Hub is a master data management (MDM) product that centralizes, de-duplicates, and enriches constituent data. It unifies fragmented information without disrupting existing business processes or IT investments. Role based data access and privacy rules guarantee maximum security and privacy. Data is continuously and automatically synchronized with all source systems. 
With the Oracle Customer Hub managing the master constituent identity, every department can capture transaction activity against the same record, improving reporting accuracy, employee productivity, reliability of constituent analytics, and day-to-day constituent relationships. Oracle Application Integration Architecture provides a collection of core pre-built processes to support out of the box Master Data Governance across Oracle Customer Hub, Siebel CRM, and Oracle E-Business Suite. It also provides a framework to enable MDM integrations with other Oracle and non-Oracle applications. Oracle AIA removes some of the key inhibitors to implementing a service-oriented architecture (SOA) by providing a pre-built SOA-based middleware foundation as well as industry-optimized service oriented applications, all built around a SOA governance model that encourages effective design and reuse. I encourage you to read Oracle Solution for Mastering Constituents Data for Public Sector – Tax Authorities by Roberto Negro. It is an outstanding whitepaper that describes how the Oracle MDM solution allows you to create a unified, reconciled source of high-quality constituent data and gain an accurate single view of each constituent. This foundation enables you to lower the costs associated with data quality and integration and create a tax organization that is efficient, secure and constituent-centric. Also, don’t forget the upcoming webcast on Thursday, February 10th: Deliver Improved Services to Citizens at Lower Cost to your Organization Our Guest Speaker is Ruben Spekle, from Capgemini. He will also provide insight into Public Sector Master Data Management and Case Management implementations including one that was executed for a Dutch Government Agency. If you are interested in how governmental organizations from around the world are using MDM to advance their cause, click here to register for the webcast.

    Read the article

  • How do I get same scrollbar style for gtk-2.0 and gtk-3.0 apps?

    - by David López
    Sorry for my English mistakes; I'm Spanish. I'm using Ubuntu 11.10 on a tablet. I've removed overlay-scrollbars and I have increased the scrollbar size so I can use them with fingers. In /usr/share/themes/Ambiance/gtk-2.0/gtkrc I've changed:

        GtkScrollbar::slider-width = 23
        GtkScrollbar::min-slider-length = 51

    and added:

        GtkScrollbar::has-backward-stepper = 0
        GtkScrollbar::has-forward-stepper = 0

    In /usr/share/themes/Ambiance/gtk-3.0/gtk-widgets.css I've changed (in the .scrollbar item):

        GtkScrollbar-min-slider-length: 51;
        GtkRange-slider-width: 23;

    Now my scrollbars are usable with fingers, but they look different in gtk-2.0 and gtk-3.0 apps. In the picture, the left scrollbar is a gtk-2.0 app and the right one is gtk-3.0. I want to set up the gtk-2.0 bar to be exactly the same as the gtk-3.0 one, that is:

    • Make the upper and lower extremes empty (orange circles in the picture)
    • Reduce the length of the 3 horizontal lines (black ellipses)

    Can somebody help me? Thanks.

    Read the article

  • Windows Phone 7 Development – Useful Links

    - by David Turner
    Here are some excellent links for anyone developing for Windows Phone 7:

    • J.D. Meier’s Windows Phone Developer Guidance Map – this is immense.  Also check out the Silverlight version
    • Justin Angel’s site – some really great articles on unlocked roms, automation and Continuous Integration
    • Windows Phone 7 Development Best Practices Wiki
    • Jeff Blankenburg’s 31 days of Windows Phone 7
    • This post of links to sample code for Windows Phone
    • Tim Heuer’s blog, particularly this post of Tips and Tricks
    • Kevin Marshall's blog, particularly the epic WP7 Development Tips Part 1 post
    • Code Samples for Windows Phone on MSDN

    If you have unlocked your phone for development, then you can use the WPConnect tool to connect to the device rather than using the Zune client.  I found it useful to pin a shortcut to WPConnect in my Start Menu. The Performance Counters displayed when you debug your app on a device are useful for seeing things like frame rate and memory usage; this page on MSDN explains what the numbers mean.  Jeff Blankenburg covers this in more detail on his blog. I also came across this set of links to tutorials recently, which looks very useful. Creating Windows Phone 7 Application and Marketplace Icons: http://expression.microsoft.com/en-us/gg317447.aspx

    Read the article

  • OWB – How to update OWB after Database Cloning

    - by David Allan
    One of the most commonly asked questions about OWB has led to one of the most commonly accessed support documents (strange, that): the document describing how to update the OWB repository details after cloning the Oracle database. The document on the Oracle support site has id 434272.1 and is titled 'How To Update Warehouse Builder After A Database Cloning (Doc ID 434272.1)'. This post is really for me to remember the document id ;-)

    Read the article
