Search Results

Search found 1755 results on 71 pages for 'publish'.

Page 22 of 71

  • Facebook JS API: How to get FB.Connect.streamPublish to use auto_publish

    - by Prody
    I'm trying to publish stuff to someone's wall. It works, but I'm also trying to use the auto_publish feature so the user will only get one popup granting the publish_stream extended permission. So I set streamPublish's auto_publish to true, but I still get the popup asking me if I want to publish and/or edit the message. What am I doing wrong? Here's what I'm running:

        FB.ensureInit(function () {
            FB.Facebook.get_sessionState().waitUntilReady(function() {
                FB.Connect.showPermissionDialog("publish_stream", function(perms) {
                    if (perms == "publish_stream") {
                        FB.Facebook.apiClient.friends_get(null, function(result) {
                            var markup = "";
                            var targets = result;
                            targets = [testFriendsIDForTesting];
                            var attachment = {
                                name: "Blablabla",
                                href: window.location.href,
                                description: "description",
                                caption: "caption"
                            };
                            var actionLinks = [{ text: "View", href: window.location.href }];
                            var num_targets = targets.length;
                            for (var i = 0; i < num_targets; i++) {
                                var fId = targets[i];
                                FB.Connect.streamPublish("none", attachment, actionLinks, fId, "none", null, false);
                            }
                        });
                    }
                });
            });
        });

    Read the article

  • resolving overloads in boost.python

    - by swarfrat
    I have a C++ class like this:

        class ConnectionBase {
        public:
            ConnectionBase();
            template <class T> Publish(const T&);
        private:
            virtual void OnEvent(const Overload_a&) {}
            virtual void OnEvent(const Overload_b&) {}
        };

    My templates and overloads are a known, fixed set of types at compile time. The application code derives from ConnectionBase and overrides OnEvent for the events it cares about. I can do this because the set of types is known. OnEvent is private because the user never calls it; the class creates a thread that calls it as a callback. The C++ code works. I have wrapped this in boost.python, and I can import it and publish from Python. I want to create the equivalent of the following in Python:

        class ConnectionDerived {
        public:
            ConnectionDerived();
        private:
            virtual void OnEvent(const Overload_b&) {
                // application code
            }
        };

    But... since Python isn't typed, and all the boost.python examples I've seen dealing with internals are on the C++ side, I'm a little puzzled as to how to do this. How do I override specific overloads?

    Read the article

  • Trace PRISM / CAL events (best practice?)

    - by Christian
    Ok, this question is for people with either a deep knowledge of PRISM or some magic skills I just lack (yet). The background is simple: Prism allows the declaration of events to which the user can subscribe or publish. In code this looks like this:

        _eventAggregator.GetEvent<LayoutChangedEvent>().Subscribe(UpdateUi, true);
        _eventAggregator.GetEvent<LayoutChangedEvent>().Publish("Some argument");

    Now this is nice, especially because these events are strongly typed, and the declaration is a piece of cake:

        public class LayoutChangedEvent : CompositePresentationEvent<string> { }

    But now comes the hard part: I want to trace events in some way. I had the idea to subscribe using a lambda expression that calls a simple log message. This worked perfectly in WPF, but in Silverlight there is some method access error (took me some time to figure out the reason). If you want to see for yourself, try this in Silverlight:

        eA.GetEvent<VideoStartedEvent>().Subscribe(obj => TraceEvent(obj, "vSe", log));

    If this were possible, I would be happy, because I could easily trace all events using a single line to subscribe. But it does not work... The alternative approach is writing a different function for each event and assigning those functions to the events. Why different functions? Well, I need to know WHICH event was published. If I use the same function for two different events I only get the payload as argument, and I have no way to figure out which event caused the tracing message. I tried: using reflection to get the causing event (not working); using a constructor in the event to enable each event to trace itself (not allowed). Any other ideas? Chris PS: Writing this text took me most likely longer than writing 20 functions for my 20 events, but I refuse to give up :-) I just had the idea to use PostSharp, which would most likely work (although I am not sure; perhaps I'd end up having only information about the base class). Tricky and such an unimportant topic...

    Read the article

  • Can the Flash CS4 [embed] tag be made to export assets to frame 2 rather than frame 1?

    - by Tim Knauf
    We're working on a Flash CS4 project where the main .fla file has ballooned in size and 'Publish' is taking forever. I suspect a large amount of the size (and at least some of the compile time) is due to the quantity of audio symbols in the library. I would love to remove this unnecessary bloat from the .fla file. I've experimented with removing an audio symbol from the library and using the [embed] metadata tag instead, like so:

        [Embed(source="audio/music/EndOfLevelDitty.mp3")]
        public var EndOfLevelDitty:Class

    The resulting published file works perfectly, but there is a problem. Our game uses a preloader on the first frame of the timeline, so all other classes need to be exported in frame 2 (as set in Publish Settings > ActionScript 3.0 Settings). So a size report normally begins like this:

        Frame #  Frame Bytes  Total Bytes  Scene
        -------  -----------  -----------  ----------------
        1        284515       284515       Scene 1
        2        5485305     5769820       (AS 3.0 Classes Export Frame)

    However, if I use an [embed] tag on a small sound, my size report is now:

        Frame #  Frame Bytes  Total Bytes  Scene
        -------  -----------  -----------  ----------------
        1        363320       363320       Scene 1
        2        5407240     5770560       (AS 3.0 Classes Export Frame)

    As you can see, the embedded sound has been exported into frame 1 rather than frame 2. If I were to embed all sounds in this manner, the size of frame 1 would grow to be huge, and users would be looking at a white screen for ages before the preloader frame even loaded. So my question is this: can I use an [embed] tag but have the embedded asset export in frame 2 instead of frame 1? Project constraints: Our team composition means we can't change to pure Flex at this stage. The compiled .swf needs to be 'all in one', so we can't split the preloader into a separate file, and we can't access external resources. Edit: I'd also settle for having the audio in an embedded library SWC, but there seems to be no way to make that embed in frame 2 either; it always ends up in frame 1.

    Read the article

  • Creating and publishing an Excel file in MOSS 2007 using data from SQL Server.

    - by Diomos
    Hello, I need help with this: We have an Excel template file in which all the calculations are already set up. A user can request a 'report'. The idea is to put a button on our site (a SharePoint portal); after clicking it, a new Excel file is generated. That means fetching current data from the database (SQL Server 2005 SP2), importing it into the template, letting the calculations produce the proper results, and then letting the user see the file. For now it's enough to publish the final Excel file to a document library. I am quite new to WSS 3.0 and MOSS 2007 and I need some advice on what the best solution would be. It looks like a fairly complex task to me. Is there a direct way to accomplish this? Or do I need one tool to get the data from the database and import it into the Excel file (SSRS?) and another tool to publish it to the document library (MOSS 2007 Excel Services?). I heard something about PerformancePoint Server 2007; is that a way to go? Thanks in advance for any advice!

    Read the article

  • PubSubHubbub Hubs

    - by PartlyCloudy
    Hi, I'm currently building a live web application based upon the PubSubHubbub protocol. However, I have encountered several issues. First, I'm searching for a hub application that I can run on my own server. There are several applications, but most of them are not mature yet, or they don't support the 0.3 spec. The official Google hub runs on Google App Engine and can even be executed locally. Unfortunately, "Tasks will not run automatically. Push the 'Run' button to execute each task." This behaviour is useful for debugging and understanding the workflow, but in some live tests it would be nice not to have to invoke all tasks manually. Is there a way to tweak the local App Engine so it runs tasks automatically? Next, I have a question concerning the spec itself. The Google reference implementation provides the initial publish method bound to the endpoint URI + /publish, but this is not reflected in the spec. So: are there any mature hubs that can be run locally for debugging? Or are there ways to configure the official Google App Engine hub to run locally and execute tasks directly? Thanks in advance

    Read the article

  • ClassCircularityError when running tomcat6 from eclipse

    - by zenmonkey
    I'm using Eclipse 3.5, the Tomcat runtime is set to Tomcat 6.0.26, and the VM is JDK 1.6.17 (Mac OS X). When I try to run a web application from the Eclipse Java EE perspective I keep seeing this error in the console:

        Caused by: java.lang.ClassCircularityError: java/util/logging/LogRecord
            at com.adsafe.util.SimpleFormatter.format(SimpleFormatter.java:11)
            at java.util.logging.StreamHandler.publish(StreamHandler.java:179)
            at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:88)
            at java.util.logging.Logger.log(Logger.java:458)
            at java.util.logging.Logger.doLog(Logger.java:480)
            at java.util.logging.Logger.logp(Logger.java:596)
            at org.apache.juli.logging.DirectJDKLog.log(DirectJDKLog.java:165)
            at org.apache.juli.logging.DirectJDKLog.info(DirectJDKLog.java:115)
            at org.apache.catalina.core.ApplicationContext.log(ApplicationContext.java:644)
            at org.apache.catalina.core.ApplicationContextFacade.log(ApplicationContextFacade.java:251)
            at org.apache.catalina.core.StandardWrapper.unavailable(StandardWrapper.java:1327)
            at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1130)
            at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:993)
            at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4187)
            at org.apache.catalina.core.StandardContext.start(StandardContext.java:4496)
            at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
            at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
            at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
            at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
            at org.apache.catalina.core.StandardService.start(StandardService.java:519)
            at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
            at org.apache.catalina.startup.Catalina.start(Catalina.java:581)
            ... 6 more

    java/util/logging/LogRecord implements Serializable, so I am not sure where the circular reference could have crept in. Has anyone seen this before, and does anyone know how to fix it?
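
    For context on where the error originates: the first frame of the stack trace is inside a custom java.util.logging formatter (SimpleFormatter.java:11). Below is a minimal, hypothetical formatter sketch — not the asker's com.adsafe.util.SimpleFormatter — showing the usual shape of a java.util.logging Formatter that only reads fields from the LogRecord and never logs from inside format(); logging (or otherwise forcing extra class loading) from within a formatter during container startup is one plausible way such a circularity can appear:

        import java.text.SimpleDateFormat;
        import java.util.Date;
        import java.util.logging.Formatter;
        import java.util.logging.LogRecord;

        // Hypothetical minimal formatter for illustration only.
        public class PlainFormatter extends Formatter {
            private final SimpleDateFormat stamp = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

            @Override
            public String format(LogRecord record) {
                StringBuilder sb = new StringBuilder();
                sb.append(stamp.format(new Date(record.getMillis())))
                  .append(' ')
                  .append(record.getLevel().getName())
                  .append(' ')
                  .append(formatMessage(record)) // resolves {0}-style parameters
                  .append(System.getProperty("line.separator"));
                return sb.toString();
            }
        }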

    Read the article

  • How to post a poll on the Facebook wall

    - by Bengt
    Hi, I'm trying to convert my poll app into a Facebook iframe app. My app is written in PHP and uses some Ajax calls to vote in a poll. In the application canvas everything is working fine, but of course I want to get the poll onto a user's wall too. Unfortunately I'm not able to find out how I can post a simple poll, with radio buttons for the options, to the wall. I know how to publish images, text, audio files and links to the wall, but I have no idea how to publish my poll there. And I don't just want to use links to vote; I want the user to be able to choose a radio button. Does anyone have an idea how to do this, or where to find information about doing this? I've been stuck on this for a while now and it's getting pretty frustrating. I'm using the new Graph API, by the way. Or is this impossible? I don't think so. Any help is appreciated. Bengt

    Read the article

  • Easy bidirectional communication via P2P NetStream

    - by andsve
    I've been looking into the P2P support in Flash 10, using the Adobe Stratus service. I have successfully been able to send data from one user to another, but my problem is that I haven't figured out how to send data back in some easy way (or as some kind of response to the first call). What I'm currently doing: first, set up a connection with the Stratus service:

        nc = new NetConnection();
        nc.addEventListener(NetStatusEvent.NET_STATUS, ncStatusHandler);
        nc.connect(APPLICATION_URL + DEVELOPER_KEY);

    On the "server" side I do:

        sendStream = new NetStream(nc, NetStream.DIRECT_CONNECTIONS);
        sendStream.addEventListener(NetStatusEvent.NET_STATUS, sendStreamHandler);
        sendStream.publish("file");

    And on the "client" side:

        // remoteFileID.text is manually copied by the user from the server (which is nc.nearID).
        recvStream = new NetStream(nc, remoteFileID.text);
        recvStream.client = this;
        recvStream.addEventListener(NetStatusEvent.NET_STATUS, recvStreamHandler);
        recvStream.play("file");

    Then I call a remote function on the client:

        ...
        sendStream.send("aRemoteFunction", parameterData);
        ...

    Now my problem: I want to do the same from the client to the server, to notify it that everything went well, or that something failed. From what I understand, I will have to set up a new NetStream from the client to the server (i.e. publish on the client and play on the server). But to accomplish this, the server needs to know the nc.nearID of the client. Is it possible to get that ID without forcing the user to manually copy it from the client to the server? Or is there an easier way for the client to talk back to the server that I am missing?

    Read the article

  • item-not-found(404) when trying to get a node using Smackx pubsub

    - by DustMason
    I'm trying to use the latest Smackx trunk to get and then subscribe to a pubsub node. However, Openfire just sends me back an error: item not found (404). I am instantiating the Java objects from ColdFusion, so my code snippets might look funny, but maybe someone will be able to tell me what I've forgotten. Here's how I create the node:

        ftype = createObject("java", "org.jivesoftware.smackx.pubsub.FormType");
        cform = createObject("java", "org.jivesoftware.smackx.pubsub.ConfigureForm").init(ftype.submit);
        cform.setPersistentItems(true);
        cform.setDeliverPayloads(true);
        caccess = createObject("java", "org.jivesoftware.smackx.pubsub.AccessModel");
        cform.setAccessModel(caccess.open);
        cpublish = createObject("java", "org.jivesoftware.smackx.pubsub.PublishModel");
        cform.setPublishModel(cpublish.open);
        cform.setMaxItems(99);
        manager = createObject("java", "org.jivesoftware.smackx.pubsub.PubSubManager").init(XMPPConnection);
        myNode = manager.createNode("subber", cform);

    And here's how I am trying to get it (in a different section of code):

        manager = createObject("java", "org.jivesoftware.smackx.pubsub.PubSubManager").init(XMPPConnection);
        myNode = manager.getNode("subber");

    Immediately after creating the node I seem to be able to publish to it like so:

        payload = createObject("java", "org.jivesoftware.smackx.pubsub.SimplePayload").init("book", "pubsub:test:book", "<book xmlns='pubsub:test:book'><title>Lord of the Rings</title></book>");
        item = createObject("java", "org.jivesoftware.smackx.pubsub.Item").init(payload);
        myNode.publish(item);

    However, it is the getNode() call that is causing my code to error. I have verified that the nodes are being created by checking the DB used by my Openfire server. I can see them in there, properly attributed as leaf nodes, etc. Any advice? Anyone else out there doing anything with XMPP and ColdFusion? I have had great success sending and receiving messages with CF and Smack; I just haven't had pubsub working yet :) Thanks!
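
    For reference, the ColdFusion createObject calls above map roughly onto the following plain-Java sketch against the same Smackx pubsub classes. This is illustration only: exact constructor signatures vary between Smack trunk revisions, and the connection parameter is assumed to be an already-authenticated XMPPConnection.

        import org.jivesoftware.smack.XMPPConnection;
        import org.jivesoftware.smackx.pubsub.*;

        public class PubSubSketch {
            static void createPublishAndFetch(XMPPConnection connection) throws Exception {
                PubSubManager manager = new PubSubManager(connection);

                // Same node configuration as the ColdFusion snippet above.
                ConfigureForm form = new ConfigureForm(FormType.submit);
                form.setPersistentItems(true);
                form.setDeliverPayloads(true);
                form.setAccessModel(AccessModel.open);
                form.setPublishModel(PublishModel.open);
                form.setMaxItems(99);
                LeafNode node = (LeafNode) manager.createNode("subber", form);

                // Publish a payload item (mirrors the Item(payload) call used above).
                SimplePayload payload = new SimplePayload("book", "pubsub:test:book",
                        "<book xmlns='pubsub:test:book'><title>Lord of the Rings</title></book>");
                node.publish(new Item(payload));

                // Later, from another part of the code, look the node up again.
                Node fetched = manager.getNode("subber");
            }
        }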

    Read the article

  • Auto-resolving a hostname in WCF Metadata Publishing

    - by Mike C
    I am running a self-hosted WCF service. In the service configuration, I am using localhost in my BaseAddresses that I hook my endpoints to. When trying to connect to an endpoint using the WCF test client, I have no problem connecting to the endpoint and getting the metadata using the machine's name. The problem that I run into is that the client that is generated from metadata uses localhost in the endpoint URLs it wants to connect to. I'm assuming that this is because localhost is the endpoint URL published by metadata. As a result, any calls to the methods on the service will fail since localhost on the calling machine isn't running the service. What I would like to figure out is if it is possible for the service metadata to publish the proper URL to a client depending on the client who is calling it. For example, if I was requesting the service metadata from a machine on the same network as the server the endpoint should be net.tcp://MYSERVER:1234/MyEndpoint. If I was requesting it from a machine outside the network, the URL should be net.tcp://MYSERVER.mydomain.com:1234/MyEndpoint. And obviously if the client was on the same machine, THEN the URL could be net.tcp://localhost:1234/MyEndpoint. Is this just a flaw in the default IMetadataExchange contract? Is there some reason the metadata needs to publish the information in a non-contextual way? Is there another way I should be configuring my BaseAddresses in order to get the functionality I want? Thanks, Mike

    Read the article

  • EOFException in ObjectInputStream only happens with Web Start, not with java(w).exe?!

    - by Houtman
    Hi, is anyone familiar with differences, with respect to streams, between starting an app with Web Start (javaws.exe) and starting it with java.exe or javaw.exe? This is the exception which I ONLY get when using Web Start:

        java.io.EOFException
            at java.io.ObjectInputStream$PeekInputStream.readFully(Unknown Source)
            at java.io.ObjectInputStream$BlockDataInputStream.readShort(Unknown Source)
            at java.io.ObjectInputStream.readStreamHeader(Unknown Source)
            at java.io.ObjectInputStream.<init>(Unknown Source)
            at fasttools.jtools.dss.api.core.remoting.thinclient.RemoteSocketChannel.<init>(RemoteSocketChannel.java:77)

    This is how I set up the connections on both sides:

        //== Server side ==
        // Thread {
        Socket mClientSocket = cServSock.accept();
        new DssServant(mClientSocket).start();
        // }

        DssServant(Socket socket) throws DssException {
            try {
                OutputStream mOutputStream = new BufferedOutputStream(socket.getOutputStream());
                cObjectOutputStream = new ObjectOutputStream(mOutputStream);
                cObjectOutputStream.flush(); // publish streamHeader
                InputStream mInputStream = new BufferedInputStream(socket.getInputStream());
                cObjectInputStream = new ObjectInputStream(mInputStream);
                ..
            } catch (IOException e) {
                ..
            }
            ..
        }

        //== Client side ==
        public RemoteSocketChannel(String host, int port, IEventDispatcher eventSubscriptionHandler) throws DssException {
            cHost = host;
            port = (port == 0 ? DssServer.PORT : port);
            try {
                cSocket = new Socket(cHost, port);
                OutputStream mOutputStream = new BufferedOutputStream(cSocket.getOutputStream());
                cObjectOut = new ObjectOutputStream(mOutputStream);
                cObjectOut.flush(); // publish streamHeader
                InputStream mInputStream = new BufferedInputStream(cSocket.getInputStream());
                cObjectIn = new ObjectInputStream(mInputStream);
            } catch (IOException e) {
                ..
            }
            ..
        }

    Thanks

    [EDIT] The Web Start console says: Java Web Start 1.6.0_19, Using JRE version 1.6.0_19-b04 Java HotSpot(TM) Client VM. The server is running the same 1.6u19.
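
    As background on the pattern the question's code follows, here is a small self-contained sketch (not the asker's DssServant/RemoteSocketChannel classes) of the usual ObjectOutputStream/ObjectInputStream handshake: each side constructs and flushes its ObjectOutputStream first, so the peer's ObjectInputStream constructor can read the serialization stream header instead of hitting end-of-stream.

        import java.io.*;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class ObjectStreamHandshake {

            // Server: accept one connection, echo one object back.
            static void serve(int port) throws IOException, ClassNotFoundException {
                try (ServerSocket server = new ServerSocket(port);
                     Socket socket = server.accept()) {
                    ObjectOutputStream out = new ObjectOutputStream(
                            new BufferedOutputStream(socket.getOutputStream()));
                    out.flush(); // push the stream header before constructing the input stream
                    ObjectInputStream in = new ObjectInputStream(
                            new BufferedInputStream(socket.getInputStream()));
                    Object request = in.readObject();
                    out.writeObject("echo: " + request);
                    out.flush();
                }
            }

            // Client: same order - output stream first, flush, then input stream.
            static Object call(String host, int port, Serializable request)
                    throws IOException, ClassNotFoundException {
                try (Socket socket = new Socket(host, port)) {
                    ObjectOutputStream out = new ObjectOutputStream(
                            new BufferedOutputStream(socket.getOutputStream()));
                    out.flush(); // publish stream header
                    ObjectInputStream in = new ObjectInputStream(
                            new BufferedInputStream(socket.getInputStream()));
                    out.writeObject(request);
                    out.flush();
                    return in.readObject();
                }
            }
        }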

    Read the article

  • How do I tell if an action is a lambda expression?

    - by Keith
    I am using the EventAggregator pattern to subscribe to and publish events. If a user subscribes to an event using a lambda expression, they must use a strong reference, not a weak reference, otherwise the expression can be garbage collected before the publish executes. I wanted to add a simple check in the DelegateReference so that if a programmer passes in a lambda expression and is using a weak reference, I throw an argument exception. This is to help "police" the code. Example:

        eventAggregator.GetEvent<RuleScheduler.JobExecutedEvent>().Subscribe
        (
            e => resetEvent.Set(),
            ThreadOption.PublisherThread,
            false,
            // filter event, only interested in the job that this object started
            e => e.Value1.JobDetail.Name == jobName
        );

        public DelegateReference(Delegate @delegate, bool keepReferenceAlive)
        {
            if (@delegate == null)
                throw new ArgumentNullException("delegate");

            if (keepReferenceAlive)
            {
                this._delegate = @delegate;
            }
            else
            {
                // TODO: throw exception if target is a lambda expression
                _weakReference = new WeakReference(@delegate.Target);
                _method = @delegate.Method;
                _delegateType = @delegate.GetType();
            }
        }

    Any ideas? I thought I could check @delegate.Method.IsStatic, but I don't believe that works... (is every lambda expression static?)

    Read the article

  • Publishing a WCF Server and client and their endpoints

    - by Ahmadreza
    Imagine developing a WCF solution with two projects (a WCF service and a web application as the WCF client). As long as I'm developing these two projects in Visual Studio and referencing the service in the client (web application) as a service reference, there is no problem. Visual Studio automatically assigns a port for the WCF server and configures everything needed, including the server and client binding, to something like this on the server:

        <service behaviorConfiguration="DefaultServiceBehavior" name="MYWCFProject.MyService">
          <endpoint address="" binding="wsHttpBinding" contract="MYWCFProject.IMyService">
            <identity>
              <dns value="localhost" />
            </identity>
          </endpoint>
          <host>
            <baseAddresses>
              <add baseAddress="http://localhost:8731/MyService.svc" />
            </baseAddresses>
          </host>
        </service>

    and in the client:

        <client>
          <endpoint address="http://localhost:8731/MyService.svc" binding="wsHttpBinding"
                    bindingConfiguration="WSHttpBinding_IMyService" contract="MyWCFProject.IMyService"
                    name="WSHttpBinding_IMyService">
            <identity>
              <dns value="localhost" />
            </identity>
          </endpoint>
        </client>

    The problem is that I frequently want to publish these two projects to two different production servers, where the service URL will be "http://mywcfdomain/MyService.svc". I don't want to change the config file every time I publish my server project. The question is: is there any feature in Visual Studio 2008 to automatically change the URLs, or do I have to define two different endpoints and set them within my code (based on a parameter in my configuration, for example Development/Published)?

    Read the article

  • Generate custom RSS/Atom feed with SyndicationFeedFormatter made from XML

    - by Sentax
    I have followed this article and implemented my service, and I can open the web browser and see the test data being published. I would like to create a custom formatted response; for my needs this will not be published to the internet, it's an isolated feed that other devices on the local network can read to get the data I'm publishing. I'd like to create an XML document and publish it instead of using the SyndicationItem that the article uses to display title, author, description, etc. I would like to publish something simple like:

        <MyData>
          <ID>33883</ID>
          <Title>The Name</Title>
          <Artist>The Artist</Artist>
        </MyData>

    I know how to create that with an XmlWriter, but how do I publish it via the SyndicationFeedFormatter that is the return type of the function in the article? I have seen the XmlSyndicationContent class but haven't seen any practical examples that would accomplish what I want to do.

    Read the article

  • How to return a list using SwingWorker

    - by Ender
    I have an assignment where I have to create an image gallery which uses a SwingWorker to load the images from files; once the images are loaded you can flip through them and have a slideshow play. I am having trouble getting the list of loaded images back from the SwingWorker. This is what happens in the background; it just publishes progress to a TextArea:

        // In a background thread
        @Override
        public List<Image> doInBackground() {
            List<Image> images = new ArrayList<Image>();
            for (File filename : filenames) {
                try {
                    //File file = new File(filename);
                    System.out.println("Attempting to add: " + filename.getAbsolutePath());
                    images.add(ImageIO.read(filename));
                    publish("Loaded " + filename);
                    System.out.println("Added file" + filename.getAbsolutePath());
                } catch (IOException ioe) {
                    publish("Error loading " + filename);
                }
            }
            return images;
        }

    When it is done, I just insert the images into a List<Image>, and that is all it does:

        // In the EDT
        @Override
        protected void done() {
            try {
                for (Image image : get()) {
                    list.add(image);
                }
            } catch (Exception e) {
            }
        }

    I also created a method called getImages() that returns the list. What I need is the list from getImages(), but it doesn't seem to work when I call execute(). For example:

        MySwingWorkerClass swingworker = new MySwingWorkerClass(log, list, filenames);
        swingworker.execute();
        imageList = swingworker.getImage()

    Once it reaches imageList, nothing is returned. The only way I was able to get the list was when I used run() instead of execute(). Is there another way to get the list, or is the run() method the only way? Or perhaps I am not understanding the SwingWorker class.
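
    For illustration only, here is a minimal sketch of the usual way to read a SwingWorker's result after execute(): wait for the worker to reach the DONE state (in done(), or via a PropertyChangeListener as shown) and only then call get(), which no longer blocks at that point. The ImageConsumer callback and the method names below are made up for the example and are not part of the asker's MySwingWorkerClass.

        import java.awt.Image;
        import java.beans.PropertyChangeEvent;
        import java.beans.PropertyChangeListener;
        import java.util.List;
        import javax.swing.SwingWorker;

        public class WorkerResultSketch {

            // Kick off a worker and hand the finished list to a callback on the EDT.
            static void loadImages(final SwingWorker<List<Image>, String> worker,
                                   final ImageConsumer consumer) {
                worker.addPropertyChangeListener(new PropertyChangeListener() {
                    @Override
                    public void propertyChange(PropertyChangeEvent evt) {
                        if ("state".equals(evt.getPropertyName())
                                && evt.getNewValue() == SwingWorker.StateValue.DONE) {
                            try {
                                // get() does not block here: the worker has already finished.
                                consumer.onImagesLoaded(worker.get());
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                        }
                    }
                });
                worker.execute(); // returns immediately; the listener fires later
            }

            // Hypothetical callback interface used by the sketch.
            interface ImageConsumer {
                void onImagesLoaded(List<Image> images);
            }
        }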

    Read the article

  • WordPress generating slow MySQL queries - is it an index problem?

    - by tash
    Hello Stack Overflow, I've got very slow MySQL queries coming from my WordPress site. It's making everything slow and I think it's eating up CPU. I've pasted the EXPLAIN results for the two most frequently problematic queries below. This is a typical result, although very occasionally the queries do seem to run at a more normal speed. I have the usual WordPress indexes on the database tables. You will see that one of the queries is generated by WordPress core code, and not by anything specific to my site, like the theme. I have a vague feeling that the database is not always using the indexes, or is not using them properly... Is that right? Does anyone know how to fix it? Or is it a different problem entirely? Many thanks in advance for any help anyone can offer - it is hugely appreciated.

        Query: [wp-blog-header.php(14): wp()]
        SELECT SQL_CALC_FOUND_ROWS wp_posts.* FROM wp_posts
        WHERE 1=1 AND wp_posts.post_type = 'post'
          AND (wp_posts.post_status = 'publish' OR wp_posts.post_status = 'private')
        ORDER BY wp_posts.post_date DESC LIMIT 0, 6

        id  select_type  table     type  possible_keys     key               key_len  ref    rows  Extra
        1   SIMPLE       wp_posts  ref   type_status_date  type_status_date  63       const  427   Using where; Using filesort

        Query time: 34.2829 (ms)

        Query: [wp-content/themes/LMHR/index.php(40): query_posts()]
        SELECT SQL_CALC_FOUND_ROWS wp_posts.* FROM wp_posts
        WHERE 1=1 AND wp_posts.ID NOT IN (
            SELECT tr.object_id FROM wp_term_relationships AS tr
            INNER JOIN wp_term_taxonomy AS tt ON tr.term_taxonomy_id = tt.term_taxonomy_id
            WHERE tt.taxonomy = 'category' AND tt.term_id IN ('217', '218', '223', '224')
        )
        AND wp_posts.post_type = 'post'
        AND (wp_posts.post_status = 'publish' OR wp_posts.post_status = 'private')
        ORDER BY wp_posts.post_date DESC LIMIT 0, 6

        id  select_type         table     type    possible_keys                      key               key_len  ref                                       rows  Extra
        1   PRIMARY             wp_posts  ref     type_status_date                   type_status_date  63       const                                     427   Using where; Using filesort
        2   DEPENDENT SUBQUERY  tr        ref     PRIMARY,term_taxonomy_id           PRIMARY           8        func                                      1     Using index
        2   DEPENDENT SUBQUERY  tt        eq_ref  PRIMARY,term_id_taxonomy,taxonomy  PRIMARY           8        antin1_lovemusic2010.tr.term_taxonomy_id  1     Using where

        Query time: 70.3900 (ms)

    Read the article

  • ASP.NET Web Application: use 1 or multiple virtual directories

    - by tster
    I am working on a (largish) internal web application which has multiple modules (security, execution, features, reports, etc.). All the pages in the app share navigation, CSS, JS, controls, etc. I want to make a single "Web Application" project which includes all the pages for the app and then references various projects that hold the database and business logic. However, some of the people on the project want separate projects for the pages of each module. To make this clearer, this is what I'm advocating the projects should be:

        /WebInterface*
        /SecurityLib
        /ExecutionLib
        etc...

    And here is what they are advocating:

        /SecurityInterface*
        /SecurityLib
        /ExecutionInterface*
        /ExecutionLib
        etc...

    (* project will be published to a virtual directory in IIS)

    Basically, what I'm looking for are the advantages of both approaches. Here is what I can think of so far. Single virtual directory pros: modules can share a single MasterPage; modules can share UserControls (this will be common); links to other modules are within the same virtual directory and thus don't need to be fully qualified; there's less chance of having incompatible module versions together. Multiple virtual directories pros: you can publish a new version of a single module without disrupting other modules; each module is more compartmentalized, so it's less likely that changes will break other modules. I don't buy those arguments though. First, using load-balanced servers (which we will have) we should be able to publish new versions of the project with zero downtime, assuming there are no breaking database changes. Second, if something "breaks" another module, then there is either an improper dependency, or the break will show up eventually in the other module anyway, when the developers copy over the latest version of the UserControl, MasterPage or DLL. As a point of reference, there are about 10 developers on the project for about 50% of their time. The initial development will take about 9 months.

    Read the article

  • I'm getting undefined using JSON in jQuery - why?

    - by YoniGeek
    I'm learning some JSON. I'm trying to list some data about dogs from Twitter, but I can't really present the data. I believe the error is inside the map method; something I'm missing. Thanks for your help.

        <body>
          <h1>U almost there!!</h1>
          <script src="jquery-1.7.1.js"></script>
          <script>
          // PubSub
          (function( $ ) {
            var o = $( {} );
            $.each({ trigger: 'publish', on: 'subscribe', off: 'unsubscribe' }, function( key, val ) {
              jQuery[val] = function() {
                o[key].apply( o, arguments );
              };
            });
          })( jQuery );

          $.getJSON('http://search.twitter.com/search.json?q=dogs&callback=?', function( info ) {
            $.publish( 'twitter/info', info );
          });

          // ...
          $.subscribe( 'twitter/info', function( e, info ) {
            $('body').html(
              $.map( info, function( obj ) { // <--- here is the error, something I'm missing, right?
                return '<li>' + obj.text + '</li>';
              }).join('')
            );
          });
          </script>
        </body>
        </html>

    Read the article

  • Dealing with the update location for ClickOnce.

    - by Assimilater
    I'm not sure how many people here are experts with Visual Studio, but I'd imagine a handful (not to raise expectations but to appeal to your egos :P). I'm working primarily in Visual Basic for now (though I hope to switch to C# in the near future, and maybe a Java or web app). Basically I'm trying to create an update feature that will work similarly to how common programs such as Firefox or iTunes update automatically. There is supposed to be functionality provided for this in what is called ClickOnce. I carry out the following procedure and get the following error when trying to change the update URL of my program to a password-protected FTP location:

        1. Go to project properties
        2. Go to Publish
        3. Click Updates
        4. Click Browse
        5. Click FTP Site
        6. Under Server put: web###.opentransfer.com
        7. Under Port: 21
        8. Under Directory put: CMSOFT
        9. Passive mode is selected (which is what FileZilla tells me the server is accessed with)
        10. Anonymous User is unselected and a username and password are typed in
        11. Push OK
        12. Under Update location it shows: ftp://web###.opentransfer.com/CMSOFT
        13. I push OK

    I then see a message box titled "Microsoft Visual Basic 2010 Express" with an X icon: Publish.UpdateUrl: The string must be a fully qualified URL or UNC path, for example "http://www.microsoft.com/myapplication" or "\\server\myapplication". I've tried changing the directory to "CMSOFT/PQCM.exe" and the results are the same... hope this was descriptive enough.

    Read the article

  • DRY jQuery for RESTful PUT/DELETE links

    - by Aupajo
    I'm putting together PUT/DELETE links, a la Rails, which when clicked create a POST form with a hidden input labelled _method that sends the intended request type. I want to make it DRYer, but my jQuery knowledge isn't up to it. HTML:

        <a href="/articles/1" class="delete">Destroy Article 1</a>
        <a href="/articles/1/publish" class="put">Publish Article 1</a>

    jQuery:

        $(document).ready(function() {
            $('.delete').click(function() {
                if (confirm('Are you sure?')) {
                    var f = document.createElement('form');
                    $(this).after($(f).attr({
                        method: 'post',
                        action: $(this).attr('href')
                    }).append('<input type="hidden" name="_method" value="DELETE" />'));
                    $(f).submit();
                }
                return false;
            });

            $('.put').click(function() {
                var f = document.createElement('form');
                $(this).after($(f).attr({
                    method: 'post',
                    action: $(this).attr('href')
                }).append('<input type="hidden" name="_method" value="PUT" />'));
                $(f).submit();
                return false;
            });
        });

    Read the article

  • WordPress update_post_meta values. Delete when empty or just test for ""?

    - by Scott B
    My function below takes the values from my custom meta fields (after a post has been edited and Save or Publish has been clicked) and updates or inserts the posted meta values. However, if the user leaves a field blank, I believe I want to delete the meta altogether (so I can test for its presence and display accordingly, vs. just checking for ""). For example, one of my meta options gives the user the ability to add a custom title to their post, which, when present, will populate the page's <title> tag. However, if the field is left empty, I want to default the <title> tag to the_title(), which is simply the post title used to identify the page/post. Since I'm not deleting the meta on save, it's always present after the first time a user enters something in there, so get_post_meta($post->ID, 'MyCustomTitle', true) is always true. Further, they cannot blank it out by clearing the title field and hitting Publish. What am I missing in the save in order to clear the value to "" when the user clears the field?

        if ($_POST['MyCustomTitle']) {
            update_custom_meta($postID, $_POST['MyCustomTitle'], 'MyCustomTitle');
        }

        function update_custom_meta($postID, $newvalue, $field_name) {
            // To create new meta
            if (!get_post_meta($postID, $field_name)) {
                add_post_meta($postID, $field_name, $newvalue);
            } else {
                // or to update existing meta
                update_post_meta($postID, $field_name, $newvalue);
            }
        }

    Read the article

  • SOA Community Newsletter: new edition!

    - by mseika
    SOA PARTNER COMMUNITY NEWSLETTERAUGUST 2012 Dear SOA partner community member Have you submitted your feedback on SOA Partner Community Survey 2012? This is the last chance to participate in the survey. We recommend you to complete the survey and help us to improve our SOA Community. Thanks to all attendees and trainers for their participation in the excellent Fusion Middleware Summer Camps held in Lisbon and Munich. I would also like to thank you for the great feedback and the nice reports provided by AMIS Technology Blog & Middleware by Link Consulting. Most of our courses have been overbooked, if you did not get a chance or missed it, we offer a wide range of online training and the course material. Key take-away from the advanced BPM course is to become an expert in ADF. Here is the course from Grant Ronald Learn Advanced ADF online available. The Link Consulting Team became experts in SOA Governance with EAMS and Oracle Enterprise Repository! We always encourage our community members to share their best practices and are very keen to publish it. Please let us know if you want to share your best practices through this medium.We encourage you to make use of the Specialization benefits - this month we are giving an opportunity to Promote Your SOA & BPM Events. Jürgen KressOracle SOA & BPM Partner Adoption EMEA NEW CONTENT Presentations & Training material OFM Summer CampsPromote Your SOA & BPM Events Advanced ADF Online, For Free By Grant BPM 11g Customer Stories & Solution Catalog & Process Accelerators Delivering SOA Governance with EAMS by Link Consulting Team WebLogic Server Provisioning and Patching News from our Partners & CommunityUpdated material by Oracle Connect and Network SOA Blogs SOA on Facebook SOA on LinkedIn SOA on Twitter Mix SOA Forum SOA Workspace PRESENTATIONS & TRAINING MATERIAL OFM SUMMER CAMPS Thanks to all attendees who invested their time and utilized the opportunity to attend the Summer Camps! Due to high demand of our most of the trainings, we had a long waiting list with more numbers of partners who are keen to attend it. We would like to give our special thanks to all trainers, who delivered excellent workshops! Most of the presentations and course material have been posted on our SOA Community Workspaceand WebLogic Community Workspace. You can access the content only if you are a registered community member. To register for the SOA Community please click here. You can register for the WebLogic Community here. To find out the first impressions of the event please visit our Facebook pages:www.facebook.com/WebLogicCommunity &www.facebook.com/soacommunity or Picasa AlbumThanks for the excellent blog posts from AMIS Technology Blog & Middleware by Link Consulting. Let us know if you published a twitter blog on@soacommunity & @wlscommunity. We will be pleased to publish it in our Newsletters. BPM Course Quotes “Its always easy, if you know, what you are doing” - Torsten Winterberg, Opitz“ The best ideas are the ideas from the best” - Filipe Sequeria, Primesoft “Best invest in the education in the last 12 months” - Richard Schaller, IPT “Practice best practice with the best instructor” - Graham Lamond Capgemini “If you have basic BPM knowledge, this is the course to really mater it” - Diogo Henriques Link Consulting “Very good trainers lot of work. 
Lot of fun as well” - Matthias Gris Workflow Factory “If you like to accelerate in Oracle come to the training to bring it all together” - Marcel van der Glind, Amis ADF Course Quotes "Excellent training, great opportunity to network!" - Frank Houweling, Amis "Lots of fun and good ideas" - Ana Santiago, GFI "Learn ADF, worth it Fusion Apps is the future" - Miguel Delgadillo, STO Consulting "The best way to learn Fusion Middleware from the #1" Alexandro Montantes, STO Consulting "Be advanced to to be the first” - Dimitar Petrov Fadata "Great opportunity to suck all the knowledge out of some very experienced product managers” - Wilfred von der Deijl, The Future Group WebLogic Course Quotes “Oracle trainings are the best” - Pedro Neto Novobas“ "Excellent training, well organized” - Pedro Antunh, Capgemini “This course dives you into Oracle WebLogic giving you a quick start on benefiting from Fusion Apps” - Leonardo Fernandes, Outsystems Additional Quotes “Thanks a lot again for organizing such a great and informative Summer Camp. Both training and networking were organized very professionally. I have gained tons of very useful Info, which will definitely help to increase quality of our future projects.” - Daniel Fasko fss-group.com I didn’t get the chance yesterday to thank you for a most enjoyable and thoroughly educational time I had in Munich over the last few days.” - Jeroen Bakker Ordina “Just to congratulate you on a great event, not only today but also in the previous days of training. As we know, a very good organization and, as a native Portuguese that knows Lisbon very good, a nice choice of places to visit. Looking forward to come again next year.” Pedro Miguel Neto, Novobase PROMOTE YOUR SOA & BPM EVENTS The Partner Event Publisher has just been made available to all SOA & BPM specialized partners in EMEA. Partners now have the opportunity to publish their events to theOracle.com/events site and spread the word on their upcoming live in-person and/or live webcast events. See the demo below and click here to read more information. ADVANCED ADF ONLINE, FOR FREE BY GRANT The second part of the advanced ADF online eCourse is Live now! This covers the advanced topics of region and region interaction as well as getting down and dirty with some of the layout features of ADF Faces, skinning and DVT components. The aim of this course is to give you a self-paced learning aid which covers the more advanced topics of ADF development. The content is developed by Product Management and our Curriculum development teams and is based on advanced training material we have been running internally for about 18 months. We will get started on the next chapter, but in the meantime, please have a look at chapters one and two. Back to top BPM 11G CUSTOMER STORIES & SOLUTION CATALOG & PROCESS ACCELERATORS Stories Everyone loves a good story on planning or implementing a BPM strategy. Everyone wants to hear how it was done before?, what worked?, what was achieved? If you have achieved success with BPM, we are very keen to hear your stories and examples of how your customers use it. We receive lots of requests from people who are thinking of using BPM to solve a specific problem or in combination with a specific technology to talk to someone who has done it before. These stories are invaluable. Drop down the details of anything you think is relevant with a bit of detail and we will follow up on it. 
As one good deed deserves another, we will do our best to give you stories if you need them to show that where you are going, others have treaded before. Send your stories to us using this e-mail link and we will share them among other like minded people. Solution Catalogue This summer, Oracle is launching a solution catalogue specifically intended for partners. If you have delivered a successful implementation in BPM and think it could be reused and applied again in a similar scenario in the same industry or in a similar environment, then we ware keen to know about it and will add it to the solution catalogue. The solution catalogue will showcase successful BPM solutions both inside and outside Oracle. Be in touch with us on this e-mail link and we will make sure to add your solution. Process AcceleratorsFinally if you have specific processes that you are expert on, you have implemented at a customer and you want to work with us on getting these productised, then we would love to know about it. The process accelerator programme is explained in the most recent SOA/BPM Community Newsletter but again feel free to contact us if you want to get involved. Good luck with BPM and let us know how we can help. Barry O'Reilly Director BPM [email protected] DELIVERING SOA GOVERNANCE WITH EAMS BY LINK CONSULTING TEAM In the last 12 years Link Consulting has been making its presence in specific areas such as Governance and Architecture, both in terms of practices and methodologies, products, know-how and technological expertise. The Enterprise Architecture Management System - Oracle Enterprise Edition (EAMS - OER Edition) is the result of this experience and combines the architecture management solution with OER in order to deliver a product specialized for SOA Governance that gathers the better of two worlds in solution that enables SOA Governance projects, initiatives and programs. Enterprise Architecture Management System Enterprise Architecture Management System (EAMS), is an automation based solution that enables the efficient management of Enterprise Architectures. The solution uses configured enterprise repositories and takes advantages of its features to provide automation capabilities to the users. EAMS provides capabilities to create/customize/analyze repository data, architectural blueprints, reports and analytic charts. Oracle Enterprise Repository Oracle Enterprise Repository (OER) is one of the major and central elements of the Oracle SOA Governance solution. Oracle Enterprise Repository provides the tools to manage and govern the metadata for any type of software asset, from business processes and services to patterns, frameworks, applications, components, and models. OER maps the relationships and inter-dependencies that connect those assets to improve impact analysis, promote and optimize their reuse, and measure their impact on the bottom line. It provides the visibility, feedback, controls, and analytics to keep your SOA on track to deliver business value. The intense focus on automation helps to overcome barriers to SOA adoption and streamline governance throughout the lifecycle. Core capabilities of the OER include: Asset Management Asset Lifecycle Management Usage Tracking Service Discovery Version Management Dependency Analysis Portfolio Management EAMS - OER Edition The solution takes the advantages and features from both products and combines them in a symbiotic tool that enhances the quality of SOA Governance Initiatives and Programs. 
EAMS is able to produce a vast number of outputs by combining its analytical engine, SOA-specific configurations and the assets in OER and other related tools, catalogs and repositories. The configurations encompass not only the extendable parametrization of the metadata but also fully configurable blueprints, PowerPoint reports, charts and queries. The SOA blueprints The solution comes with a set of predefined architectural representations that help the organization better perceive their SOA landscape. More blueprints can be easily created in order to accommodate the organizations needs in terms of detail, audience and metadata. Charts & Dashboards The solution encompasses a set of predefined charts and dashboards that promote a more agile way to control and explore the assets. Time Based Visualization All representations are time bound, and with EAMS - OER you can truly govern SOA with a complete view of the Past, Present and Future; The solution delivers Gap Analysis, a project oriented approach while taking into consideration the As-Was, As-Is an To-Be. Time based visualization differentiating factors: Extensive automation and maintenance of architectural representations Organization wide solution. Easy access and navigation to and between all architectural artifacts and representations. Flexible meta-model, customization and extensibility capabilities. Lifecycle management and enforcement of the time dimension over all the repository content. Profile based customization. Comprehensive visibility Architectural alignment Friendly and striking user interfaces For more information on EAMS visit us here. For more information on SOA visit us here. WEBLOGIC SERVER PROVISIONING AND PATCHING For access to the Oracle demo systems please visit OPN and talk to your Partner Expert.SOA Suite and BPM Suite runs on WebLogic! We are pleased to announce the availability of a WebLogic Server Management demo that showcases some of the key provisioning and patching capabilities of WebLogic Server Management Pack Enterprise Edition (EE). To learn more about these features - as well as other features of the pack - please visit the pack's saleskit page.Demo Highlights The demo showcases the following capabilities: Patching Oracle WebLogic Servers Standardizing WebLogic Server Patch Rollouts Creating a WebLogic Domain Provisioning Profile Cloning a WebLogic Domain from a Provisioning Profile Deploying a Java EE Application Scaling Out an Oracle WebLogic Cluster Demo Instructions Go to the DSS website for Oracle Partners. On the Standard Demo Launchpad page, under the “Software Lifecycle Automation” section, click on the link “EM Cloud Control 12c WLS Provisioning and Patching” (tagged as “NEW”). Specific demo launchpad page contains a link to the detailed demo script with instructions on how to show the demo.

    Read the article

  • Windows Azure: Announcing release of Windows Azure SDK 2.2 (with lots of goodies)

    - by ScottGu
    Earlier today I blogged about a big update we made today to Windows Azure, and some of the great new features it provides. Today I’m also excited to also announce the release of the Windows Azure SDK 2.2. Today’s SDK release adds even more great features including: Visual Studio 2013 Support Integrated Windows Azure Sign-In support within Visual Studio Remote Debugging Cloud Services with Visual Studio Firewall Management support within Visual Studio for SQL Databases Visual Studio 2013 RTM VM Images for MSDN Subscribers Windows Azure Management Libraries for .NET Updated Windows Azure PowerShell Cmdlets and ScriptCenter The below post has more details on what’s available in today’s Windows Azure SDK 2.2 release.  Also head over to Channel 9 to see the new episode of the Visual Studio Toolbox show that will be available shortly, and which highlights these features in a video demonstration. Visual Studio 2013 Support Version 2.2 of the Window Azure SDK is the first official version of the SDK to support the final RTM release of Visual Studio 2013. If you installed the 2.1 SDK with the Preview of Visual Studio 2013 we recommend that you upgrade your projects to SDK 2.2.  SDK 2.2 also works side by side with the SDK 2.0 and SDK 2.1 releases on Visual Studio 2012: Integrated Windows Azure Sign In within Visual Studio Integrated Windows Azure Sign-In support within Visual Studio is one of the big improvements added with this Windows Azure SDK release.  Integrated sign-in support enables developers to develop/test/manage Windows Azure resources within Visual Studio without having to download or use management certificates.  You can now just right-click on the “Windows Azure” icon within the Server Explorer inside Visual Studio and choose the “Connect to Windows Azure” context menu option to connect to Windows Azure: Doing this will prompt you to enter the email address of the account you wish to sign-in with: You can use either a Microsoft Account (e.g. Windows Live ID) or an Organizational account (e.g. Active Directory) as the email.  The dialog will update with an appropriate login prompt depending on which type of email address you enter: Once you sign-in you’ll see the Windows Azure resources that you have permissions to manage show up automatically within the Visual Studio Server Explorer (and you can start using them): With this new integrated sign in experience you are now able to publish web apps, deploy VMs and cloud services, use Windows Azure diagnostics, and fully interact with your Windows Azure services within Visual Studio without the need for a management certificate.  All of the authentication is handled using the Windows Azure Active Directory associated with your Windows Azure account (details on this can be found in my earlier blog post). Integrating authentication this way end-to-end across the Service Management APIs + Dev Tools + Management Portal + PowerShell automation scripts enables a much more secure and flexible security model within Windows Azure, and makes it much more convenient to securely manage multiple developers + administrators working on a project.  It also allows organizations and enterprises to use the same authentication model that they use for their developers on-premises in the cloud.  It also ensures that employees who leave an organization immediately lose access to their company’s cloud based resources once their Active Directory account is suspended. 
Filtering/Subscription Management Once you login within Visual Studio, you can filter which Windows Azure subscriptions/regions are visible within the Server Explorer by right-clicking the “Filter Services” context menu within the Server Explorer.  You can also use the “Manage Subscriptions” context menu to mange your Windows Azure Subscriptions: Bringing up the “Manage Subscriptions” dialog allows you to see which accounts you are currently using, as well as which subscriptions are within them: The “Certificates” tab allows you to continue to import and use management certificates to manage Windows Azure resources as well.  We have not removed any functionality with today’s update – all of the existing scenarios that previously supported management certificates within Visual Studio continue to work just fine.  The new integrated sign-in support provided with today’s release is purely additive. Note: the SQL Database node and the Mobile Service node in Server Explorer do not support integrated sign-in at this time. Therefore, you will only see databases and mobile services under those nodes if you have a management certificate to authorize access to them.  We will enable them with integrated sign-in in a future update. Remote Debugging Cloud Resources within Visual Studio Today’s Windows Azure SDK 2.2 release adds support for remote debugging many types of Windows Azure resources. With live, remote debugging support from within Visual Studio, you are now able to have more visibility than ever before into how your code is operating live in Windows Azure.  Let’s walkthrough how to enable remote debugging for a Cloud Service: Remote Debugging of Cloud Services To enable remote debugging for your cloud service, select Debug as the Build Configuration on the Common Settings tab of your Cloud Service’s publish dialog wizard: Then click the Advanced Settings tab and check the Enable Remote Debugging for all roles checkbox: Once your cloud service is published and running live in the cloud, simply set a breakpoint in your local source code: Then use Visual Studio’s Server Explorer to select the Cloud Service instance deployed in the cloud, and then use the Attach Debugger context menu on the role or to a specific VM instance of it: Once the debugger attaches to the Cloud Service, and a breakpoint is hit, you’ll be able to use the rich debugging capabilities of Visual Studio to debug the cloud instance remotely, in real-time, and see exactly how your app is running in the cloud. Today’s remote debugging support is super powerful, and makes it much easier to develop and test applications for the cloud.  Support for remote debugging Cloud Services is available as of today, and we’ll also enable support for remote debugging Web Sites shortly. Firewall Management Support with SQL Databases By default we enable a security firewall around SQL Databases hosted within Windows Azure.  This ensures that only your application (or IP addresses you approve) can connect to them and helps make your infrastructure secure by default.  This is great for protection at runtime, but can sometimes be a pain at development time (since by default you can’t connect/manage the database remotely within Visual Studio if the security firewall blocks your instance of VS from connecting to it). One of the cool features we’ve added with today’s release is support that makes it easy to enable and configure the security firewall directly within Visual Studio.  
Now with the SDK 2.2 release, when you try and connect to a SQL Database using the Visual Studio Server Explorer, and a firewall rule prevents access to the database from your machine, you will be prompted to add a firewall rule to enable access from your local IP address: You can simply click Add Firewall Rule and a new rule will be automatically added for you. In some cases, the logic to detect your local IP may not be sufficient (for example: you are behind a corporate firewall that uses a range of IP addresses) and you may need to set up a firewall rule for a range of IP addresses in order to gain access. The new Add Firewall Rule dialog also makes this easy to do.  Once connected you’ll be able to manage your SQL Database directly within the Visual Studio Server Explorer: This makes it much easier to work with databases in the cloud. Visual Studio 2013 RTM Virtual Machine Images Available for MSDN Subscribers Last week we released the General Availability Release of Visual Studio 2013 to the web.  This is an awesome release with a ton of new features. With today’s Windows Azure update we now have a set of pre-configured VM images of VS 2013 available within the Windows Azure Management Portal for use by MSDN customers.  This enables you to create a VM in the cloud with VS 2013 pre-installed on it in with only a few clicks: Windows Azure now provides the fastest and easiest way to get started doing development with Visual Studio 2013. Windows Azure Management Libraries for .NET (Preview) Having the ability to automate the creation, deployment, and tear down of resources is a key requirement for applications running in the cloud.  It also helps immensely when running dev/test scenarios and coded UI tests against pre-production environments. Today we are releasing a preview of a new set of Windows Azure Management Libraries for .NET.  These new libraries make it easy to automate tasks using any .NET language (e.g. C#, VB, F#, etc).  Previously this automation capability was only available through the Windows Azure PowerShell Cmdlets or to developers who were willing to write their own wrappers for the Windows Azure Service Management REST API. Modern .NET Developer Experience We’ve worked to design easy-to-understand .NET APIs that still map well to the underlying REST endpoints, making sure to use and expose the modern .NET functionality that developers expect today: Portable Class Library (PCL) support targeting applications built for any .NET Platform (no platform restriction) Shipped as a set of focused NuGet packages with minimal dependencies to simplify versioning Support async/await task based asynchrony (with easy sync overloads) Shared infrastructure for common error handling, tracing, configuration, HTTP pipeline manipulation, etc. 
Factored for easy testability and mocking Built on top of popular libraries like HttpClient and Json.NET Below is a list of a few of the management client classes that are shipping with today’s initial preview release: .NET Class Name Supports Operations for these Assets (and potentially more) ManagementClient Locations Credentials Subscriptions Certificates ComputeManagementClient Hosted Services Deployments Virtual Machines Virtual Machine Images & Disks StorageManagementClient Storage Accounts WebSiteManagementClient Web Sites Web Site Publish Profiles Usage Metrics Repositories VirtualNetworkManagementClient Networks Gateways Automating Creating a Virtual Machine using .NET Let’s walkthrough an example of how we can use the new Windows Azure Management Libraries for .NET to fully automate creating a Virtual Machine. I’m deliberately showing a scenario with a lot of custom options configured – including VHD image gallery enumeration, attaching data drives, network endpoints + firewall rules setup - to show off the full power and richness of what the new library provides. We’ll begin with some code that demonstrates how to enumerate through the built-in Windows images within the standard Windows Azure VM Gallery.  We’ll search for the first VM image that has the word “Windows” in it and use that as our base image to build the VM from.  We’ll then create a cloud service container in the West US region to host it within: We can then customize some options on it such as setting up a computer name, admin username/password, and hostname.  We’ll also open up a remote desktop (RDP) endpoint through its security firewall: We’ll then specify the VHD host and data drives that we want to mount on the Virtual Machine, and specify the size of the VM we want to run it in: Once everything has been set up the call to create the virtual machine is executed asynchronously In a few minutes we’ll then have a completely deployed VM running on Windows Azure with all of the settings (hard drives, VM size, machine name, username/password, network endpoints + firewall settings) fully configured and ready for us to use: Preview Availability via NuGet The Windows Azure Management Libraries for .NET are now available via NuGet. Because they are still in preview form, you’ll need to add the –IncludePrerelease switch when you go to retrieve the packages. The Package Manager Console screen shot below demonstrates how to get the entire set of libraries to manage your Windows Azure assets: You can also install them within your .NET projects by right clicking on the VS Solution Explorer and using the Manage NuGet Packages context menu command.  Make sure to select the “Include Prerelease” drop-down for them to show up, and then you can install the specific management libraries you need for your particular scenarios: Open Source License The new Windows Azure Management Libraries for .NET make it super easy to automate management operations within Windows Azure – whether they are for Virtual Machines, Cloud Services, Storage Accounts, Web Sites, and more.  Like the rest of the Windows Azure SDK, we are releasing the source code under an open source (Apache 2) license and it is hosted at https://github.com/WindowsAzure/azure-sdk-for-net/tree/master/libraries if you wish to contribute. PowerShell Enhancements and our New Script Center Today, we are also shipping Windows Azure PowerShell 0.7.0 (which is a separate download). You can find the full change log here. 
Here are some of the improvements provided with it: Windows Azure Active Directory authentication support Script Center providing many sample scripts to automate common tasks on Windows Azure New cmdlets for Media Services and SQL Database Script Center Windows Azure enables you to script and automate a lot of tasks using PowerShell.  People often ask for more pre-built samples of common scenarios so that they can use them to learn and tweak/customize. With this in mind, we are excited to introduce a new Script Center that we are launching for Windows Azure. You can learn about how to scripting with Windows Azure with a get started article. You can then find many sample scripts across different solutions, including infrastructure, data management, web, and more: All of the sample scripts are hosted on TechNet with links from the Windows Azure Script Center. Each script is complete with good code comments, detailed descriptions, and examples of usage. Summary Visual Studio 2013 and the Windows Azure SDK 2.2 make it easier than ever to get started developing rich cloud applications. Along with the Windows Azure Developer Center’s growing set of .NET developer resources to guide your development efforts, today’s Windows Azure SDK 2.2 release should make your development experience more enjoyable and efficient. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article
