Search Results

Search found 4244 results on 170 pages for 'nocturnal thoughts'.


  • Casting an object using 'as' returns null: myObject = newObject as MyObject; // null

    - by John Russell
    I am trying to create a custom object in AS3 to pass information to and from a server, which in this case will be Red5. In the below screenshots you will see that I am able to send a request for an object from AS3, and receive it successfully from the Java server. However, when I try to cast the received object to my defined objectType using 'as', it takes the value of null. It is my understanding that when using "as" you're checking to see if your variable is a member of the specified data type. If the variable is not, then null will be returned. This screenshot illustrates that I have successfully received my object 'o' from Red5 and I am just about to cast it to the (supposedly) identical datatype testObject of LobbyData: However, when testObject = o as LobbyData; runs, it returns null. :( Below you will see my specifications both on the Java server and the AS3 client. I am confident that both objects are identical in every way, but for some reason Flash does not think so. I have been pulling my hair out for a long time; does anyone have any thoughts? AS3 Object: import flash.utils.IDataInput; import flash.utils.IDataOutput; import flash.utils.IExternalizable; import flash.net.registerClassAlias; [Bindable] [RemoteClass(alias = "myLobbyData.LobbyData")] public class LobbyData implements IExternalizable { private var sent:int; // java sentinel private var u:String; // red5 username private var sen:int; // another sentinel? private var ui:int; // fb uid private var fn:String; // fb name private var pic:String; // fb pic private var inb:Boolean; // is in the table? private var t:int; // table number private var s:int; // seat number public function setSent(sent:int):void { this.sent = sent; } public function getSent():int { return sent; } public function setU(u:String):void { this.u = u; } public function getU():String { return u; } public function setSen(sen:int):void { this.sen = sen; } public function getSen():int { return sen; } public function setUi(ui:int):void { this.ui = ui; } public function getUi():int { return ui; } public function setFn(fn:String):void { this.fn = fn; } public function getFn():String { return fn; } public function setPic(pic:String):void { this.pic = pic; } public function getPic():String { return pic; } public function setInb(inb:Boolean):void { this.inb = inb; } public function getInb():Boolean { return inb; } public function setT(t:int):void { this.t = t; } public function getT():int { return t; } public function setS(s:int):void { this.s = s; } public function getS():int { return s; } public function readExternal(input:IDataInput):void { sent = input.readInt(); u = input.readUTF(); sen = input.readInt(); ui = input.readInt(); fn = input.readUTF(); pic = input.readUTF(); inb = input.readBoolean(); t = input.readInt(); s = input.readInt(); } public function writeExternal(output:IDataOutput):void { output.writeInt(sent); output.writeUTF(u); output.writeInt(sen); output.writeInt(ui); output.writeUTF(fn); output.writeUTF(pic); output.writeBoolean(inb); output.writeInt(t); output.writeInt(s); } } Java Object: package myLobbyData; import org.red5.io.amf3.IDataInput; import org.red5.io.amf3.IDataOutput; import org.red5.io.amf3.IExternalizable; public class LobbyData implements IExternalizable { private static final long serialVersionUID = 115280920; private int sent; // java sentinel private String u; // red5 username private int sen; // another sentinel? 
private int ui; // fb uid private String fn; // fb name private String pic; // fb pic private Boolean inb; // is in the table? private int t; // table number private int s; // seat number public void setSent(int sent) { this.sent = sent; } public int getSent() { return sent; } public void setU(String u) { this.u = u; } public String getU() { return u; } public void setSen(int sen) { this.sen = sen; } public int getSen() { return sen; } public void setUi(int ui) { this.ui = ui; } public int getUi() { return ui; } public void setFn(String fn) { this.fn = fn; } public String getFn() { return fn; } public void setPic(String pic) { this.pic = pic; } public String getPic() { return pic; } public void setInb(Boolean inb) { this.inb = inb; } public Boolean getInb() { return inb; } public void setT(int t) { this.t = t; } public int getT() { return t; } public void setS(int s) { this.s = s; } public int getS() { return s; } @Override public void readExternal(IDataInput input) { sent = input.readInt(); u = input.readUTF(); sen = input.readInt(); ui = input.readInt(); fn = input.readUTF(); pic = input.readUTF(); inb = input.readBoolean(); t = input.readInt(); s = input.readInt(); } @Override public void writeExternal(IDataOutput output) { output.writeInt(sent); output.writeUTF(u); output.writeInt(sen); output.writeInt(ui); output.writeUTF(fn); output.writeUTF(pic); output.writeBoolean(inb); output.writeInt(t); output.writeInt(s); } } AS3 Client: public function refreshRoom(event:Event) { var resp:Responder=new Responder(handleResp,null); ncLobby.call("getLobbyData", resp, null); } public function handleResp(o:Object):void { var testObject:LobbyData=new LobbyData; testObject = o as LobbyData; trace(testObject); } Java Client public LobbyData getLobbyData(String param) { LobbyData lobbyData1 = new LobbyData(); lobbyData1.setSent(5); lobbyData1.setU("lawlcats"); lobbyData1.setSen(5); lobbyData1.setUi(5); lobbyData1.setFn("lulz"); lobbyData1.setPic("lulzagain"); lobbyData1.setInb(true); lobbyData1.setT(5); lobbyData1.setS(5); return lobbyData1; }
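    In Java terms, my mental model of 'as' is the safe cast sketched below (the class names here are made up purely for illustration and are not from my project): the cast succeeds only when the runtime type actually matches, and otherwise it hands back null, which is exactly the behaviour I am seeing.

    // Rough Java analogue of the AS3 'as' operator, with illustrative names only.
    public class SafeCastSketch {
        static class LobbyData { }

        // Returns the object as LobbyData when the runtime type matches, otherwise null.
        static LobbyData asLobbyData(Object o) {
            return (o instanceof LobbyData) ? (LobbyData) o : null;
        }

        public static void main(String[] args) {
            System.out.println(asLobbyData(new Object()));    // null: wrong runtime type
            System.out.println(asLobbyData(new LobbyData())); // non-null: type matches
        }
    }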

    Read the article

  • Null-free "maps": Is a callback solution slower than tryGet()?

    - by David Moles
    In comments to "How to implement List, Set, and Map in null free design?", Steven Sudit and I got into a discussion about using a callback, with handlers for "found" and "not found" situations, vs. a tryGet() method, taking an out parameter and returning a boolean indicating whether the out parameter had been populated. Steven maintained that the callback approach was more complex and almost certain to be slower; I maintained that the complexity was no greater and the performance at worst the same. But code speaks louder than words, so I thought I'd implement both and see what I got. The original question was fairly theoretical with regard to language ("And for argument sake, let's say this language don't even have null") -- I've used Java here because that's what I've got handy. Java doesn't have out parameters, but it doesn't have first-class functions either, so style-wise, it should suck equally for both approaches. (Digression: As far as complexity goes: I like the callback design because it inherently forces the user of the API to handle both cases, whereas the tryGet() design requires callers to perform their own boilerplate conditional check, which they could forget or get wrong. But having now implemented both, I can see why the tryGet() design looks simpler, at least in the short term.) First, the callback example: class CallbackMap<K, V> { private final Map<K, V> backingMap; public CallbackMap(Map<K, V> backingMap) { this.backingMap = backingMap; } void lookup(K key, Callback<K, V> handler) { V val = backingMap.get(key); if (val == null) { handler.handleMissing(key); } else { handler.handleFound(key, val); } } } interface Callback<K, V> { void handleFound(K key, V value); void handleMissing(K key); } class CallbackExample { private final Map<String, String> map; private final List<String> found; private final List<String> missing; private Callback<String, String> handler; public CallbackExample(Map<String, String> map) { this.map = map; found = new ArrayList<String>(map.size()); missing = new ArrayList<String>(map.size()); handler = new Callback<String, String>() { public void handleFound(String key, String value) { found.add(key + ": " + value); } public void handleMissing(String key) { missing.add(key); } }; } void test() { CallbackMap<String, String> cbMap = new CallbackMap<String, String>(map); for (int i = 0, count = map.size(); i < count; i++) { String key = "key" + i; cbMap.lookup(key, handler); } System.out.println(found.size() + " found"); System.out.println(missing.size() + " missing"); } } Now, the tryGet() example -- as best I understand the pattern (and I might well be wrong): class TryGetMap<K, V> { private final Map<K, V> backingMap; public TryGetMap(Map<K, V> backingMap) { this.backingMap = backingMap; } boolean tryGet(K key, OutParameter<V> valueParam) { V val = backingMap.get(key); if (val == null) { return false; } valueParam.value = val; return true; } } class OutParameter<V> { V value; } class TryGetExample { private final Map<String, String> map; private final List<String> found; private final List<String> missing; public TryGetExample(Map<String, String> map) { this.map = map; found = new ArrayList<String>(map.size()); missing = new ArrayList<String>(map.size()); } void test() { TryGetMap<String, String> tgMap = new TryGetMap<String, String>(map); for (int i = 0, count = map.size(); i < count; i++) { String key = "key" + i; OutParameter<String> out = new OutParameter<String>(); if (tgMap.tryGet(key, out)) { found.add(key + ": " + out.value); } else { 
    missing.add(key); } } System.out.println(found.size() + " found"); System.out.println(missing.size() + " missing"); } } And finally, the performance test code: public static void main(String[] args) { int size = 200000; Map<String, String> map = new HashMap<String, String>(); for (int i = 0; i < size; i++) { String val = (i % 5 == 0) ? null : "value" + i; map.put("key" + i, val); } long totalCallback = 0; long totalTryGet = 0; int iterations = 20; for (int i = 0; i < iterations; i++) { { TryGetExample tryGet = new TryGetExample(map); long tryGetStart = System.currentTimeMillis(); tryGet.test(); totalTryGet += (System.currentTimeMillis() - tryGetStart); } System.gc(); { CallbackExample callback = new CallbackExample(map); long callbackStart = System.currentTimeMillis(); callback.test(); totalCallback += (System.currentTimeMillis() - callbackStart); } System.gc(); } System.out.println("Avg. callback: " + (totalCallback / iterations)); System.out.println("Avg. tryGet(): " + (totalTryGet / iterations)); } On my first attempt, I got 50% worse performance for callback than for tryGet(), which really surprised me. But, on a hunch, I added some garbage collection, and the performance penalty vanished. This fits with my instinct, which is that we're basically talking about taking the same number of method calls, conditional checks, etc. and rearranging them. But then, I wrote the code, so I might well have written a suboptimal or subconsciously penalized tryGet() implementation. Thoughts?
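    For completeness, if a newer Java (8 or later) were on the table, java.util.Optional would give a third shape of the same lookup. This is only a sketch of the API shape with placeholder data; it is not something I benchmarked, and Optional was not part of the original discussion.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    class OptionalMap<K, V> {
        private final Map<K, V> backingMap;

        public OptionalMap(Map<K, V> backingMap) { this.backingMap = backingMap; }

        // Wraps the possibly-null result so the caller cannot forget the missing case.
        Optional<V> lookup(K key) { return Optional.ofNullable(backingMap.get(key)); }
    }

    class OptionalExample {
        public static void main(String[] args) {
            Map<String, String> map = new HashMap<>();
            map.put("key0", "value0");
            OptionalMap<String, String> omap = new OptionalMap<>(map);
            for (String key : new String[] { "key0", "key1" }) {
                Optional<String> val = omap.lookup(key);
                System.out.println(val.isPresent() ? key + ": " + val.get() : key + " missing");
            }
        }
    }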

    Read the article

  • Using jQuery to slideToggle a group of Table Rows

    - by CCC
    Hi. I'm fairly new to javaScript and jQuery, so hopefully this will be a quick fix. I need to display a table containing data that can be grouped into several categories, and I'd like to implement a slideToggle that hides/reveals all of the observations in each given category. The code below should (ideally) display a table with 4 columns and 9 rows, with every group of 3 rows preceded by a green "Section i" row. I would like each Section header to work as a slideToggle that expands or collapses all of the rows beneath it. Right now, nothing is collapsing. Any thoughts? <head> <style type="text/css"> td{padding:5px;} </style> <script type="text/javascript" src="js/jquery.js"></script> <script type="text/javascript"> $(document).ready(function(){ $(".flip").click(function(){ $(this).next(".section").slideToggle(); }); }); </script> </head> <body> <p> <table id="main_table"> <thead> <tr class="firstline"> <th>Column1</th> <th>Column2</th> <th>Column3</th> <th>Column4</th> </tr> </thead> <tbody> <tr style="background-color:green; color:white"> <td colspan="4" class="flip"> Section 1 </td> </tr> <div class="section"> <tr> <td>item 111</td> <td>item 112</td> <td>item 113</td> <td>item 114</td> </tr> <tr> <td>item 121</td> <td>item 122</td> <td>item 123</td> <td>item 124</td> </tr> <tr> <td>item 131</td> <td>item 132</td> <td>item 133</td> <td>item 134</td> </tr> </div> <tr style="background-color:green; color:white"> <td colspan="4" class="flip"> Section 2 </td> </tr> <div class="section"> <tr> <td>item 211</td> <td>item 212</td> <td>item 213</td> <td>item 214</td> </tr> <tr> <td>item 221</td> <td>item 222</td> <td>item 223</td> <td>item 224</td> </tr> <tr> <td>item 231</td> <td>item 232</td> <td>item 233</td> <td>item 234</td> </tr> </div> <tr style="background-color:green; color:white"> <td colspan="4" class="flip"> Section 3 </td> </tr> <div class="section"> <tr> <td>item 311</td> <td>item 312</td> <td>item 313</td> <td>item 314</td> </tr> <tr> <td>item 321</td> <td>item 322</td> <td>item 323</td> <td>item 324</td> </tr> <tr> <td>item 331</td> <td>item 332</td> <td>item 333</td> <td>item 334</td> </tr> </div> </tbody> </table> </p> </body>

    Read the article

  • C socket programming: select() is returning 0 despite messages sent from server

    - by Fantastic Fourier
    Hey all, I'm using select() to recv() messages from server, using TCP/IP. When I send() messages from the server, it returns a reasonable number of bytes, saying it's sent successful. And it does get to the client successfully when I use while loop to just recv(). Everything is fine and dandy. while(1) recv() // obviously pseudocode However, when I try to use select(), select() returns 0 from timeout (which is set to 1 second) and for the life of me I cannot figure out why it doesn't see the messages sent from the server. I should also mention that when the server disconnects, select() doesn't see that either, where as if I were to use recv(), it would return 0 to indicate that the connection using the socket has been closed. Any inputs or thoughts are deeply appreciated. #include <arpa/inet.h> #include <errno.h> #include <fcntl.h> #include <netdb.h> #include <netinet/in.h> #include <pthread.h> #include <stdio.h> #include <stdlib.h> #include <string.h> #include <strings.h> #include <sys/select.h> #include <sys/socket.h> #include <sys/time.h> #include <sys/types.h> #include <time.h> #include <unistd.h> #define SERVER_PORT 10000 #define MAX_CONNECTION 20 #define MAX_MSG 50 struct client { char c_name[MAX_MSG]; char g_name[MAX_MSG]; int csock; int host; // 0 = not host of a multicast group struct sockaddr_in client_address; struct client * next_host; struct client * next_client; }; struct fd_info { char c_name[MAX_MSG]; int socks_inuse[MAX_CONNECTION]; int sock_fd, max_fd; int exit; struct client * c_sys; struct sockaddr_in c_address[MAX_CONNECTION]; struct sockaddr_in server_address; struct sockaddr_in client_address; fd_set read_set; }; struct message { char c_name[MAX_MSG]; char g_name[MAX_MSG]; char _command[3][MAX_MSG]; char _payload[MAX_MSG]; struct sockaddr_in client_address; struct client peer; }; int main(int argc, char * argv[]) { char * host; char * temp; int i, sockfd; int msg_len, rv, ready; int connection, management, socketread; int sockfds[MAX_CONNECTION]; // for three threads that handle new connections, user inputs and select() for sockets pthread_t connection_handler, manager, socket_reader; struct sockaddr_in server_address, client_address; struct hostent * hserver, cserver; struct timeval timeout; struct message msg; struct fd_info info; info.exit = 0; // exit information: if exit = 1, threads quit info.c_sys = NULL; // looking up from the host database if (argc == 3) { host = argv[1]; // server address strncpy(info.c_name, argv[2], strlen(argv[2])); // client name } else { printf("plz read the manual, kthxbai\n"); exit(1); } printf("host is %s and hp is %p\n", host, hserver); hserver = gethostbyname(host); if (hserver) { printf("host found: %s\n", hserver->h_name ); } else { printf("host not found\n"); exit(1); } // setting up address and port structure information on serverside bzero((char * ) &server_address, sizeof(server_address)); // copy zeroes into string server_address.sin_family = AF_INET; memcpy(&server_address.sin_addr, hserver->h_addr, hserver->h_length); server_address.sin_port = htons(SERVER_PORT); bzero((char * ) &client_address, sizeof(client_address)); // copy zeroes into string client_address.sin_family = AF_INET; client_address.sin_addr.s_addr = htonl(INADDR_ANY); client_address.sin_port = htons(SERVER_PORT); // opening up socket sockfd = socket(AF_INET, SOCK_STREAM, 0); if (sockfd < 0) exit(1); else { printf("socket is opened: %i \n", sockfd); info.sock_fd = sockfd; } // sets up time out option for the bound socket timeout.tv_sec = 1; // seconds 
timeout.tv_usec = 0; // micro seconds ( 0.5 seconds) setsockopt(sockfd, SOL_SOCKET, SO_RCVTIMEO, &timeout, sizeof(struct timeval)); // binding socket to a port rv = bind(sockfd, (struct sockaddr *) &client_address, sizeof(client_address)); if (rv < 0) { printf("MAIN: ERROR bind() %i: %s\n", errno, strerror(errno)); exit(1); } else printf("socket is bound\n"); printf("MAIN: %li \n", client_address.sin_addr.s_addr); // connecting rv = connect(sockfd, (struct sockaddr *) &server_address, sizeof(server_address)); info.server_address = server_address; info.client_address = client_address; info.sock_fd = sockfd; info.max_fd = sockfd; printf("rv = %i\n", rv); if (rv < 0) { printf("MAIN: ERROR connect() %i: %s\n", errno, strerror(errno)); exit(1); } else printf("connected\n"); fd_set readset; FD_ZERO(&readset); FD_ZERO(&info.read_set); FD_SET(info.sock_fd, &info.read_set); while(1) { readset = info.read_set; printf("MAIN: %i \n", readset); ready = select((info.max_fd)+1, &readset, NULL, NULL, &timeout); if(ready == -1) { sleep(2); printf("TEST: MAIN: ready = -1. %s \n", strerror(errno)); } else if (ready == 0) { sleep(2); printf("TEST: MAIN: ready = 0. %s \n", strerror(errno)); } else if (ready > 0) { printf("TEST: MAIN: ready = %i. %s at socket %i \n", ready, strerror(errno), i); for(i = 0; i < ((info.max_fd)+1); i++) { if(FD_ISSET(i, &readset)) { rv = recv(sockfd, &msg, 500, 0); if(rv < 0) continue; else if(rv > 0) printf("MAIN: TEST: %s %s \n", msg._command[0], msg._payload); else if (rv == 0) { sleep(3); printf("MAIN: TEST: SOCKET CLOSEDDDDDD \n"); } FD_CLR(i, &readset); } } } info.read_set = readset; } // close connection close(sockfd); printf("socket closed. BYE! \n"); return(0); }
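    To sanity-check the shape of the loop I am after, here is the same readiness-driven receive written against Java NIO (only a sketch: the host, port and buffer size are placeholders, and this is not the code that is misbehaving). It selects with a one-second timeout, reads only the channels flagged readable, and treats a -1 read as the peer closing.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.nio.ByteBuffer;
    import java.nio.channels.SelectionKey;
    import java.nio.channels.Selector;
    import java.nio.channels.SocketChannel;
    import java.util.Iterator;

    public class SelectSketch {
        public static void main(String[] args) throws IOException {
            Selector selector = Selector.open();
            SocketChannel channel = SocketChannel.open(new InetSocketAddress("localhost", 10000));
            channel.configureBlocking(false);                  // readiness model needs non-blocking
            channel.register(selector, SelectionKey.OP_READ);
            ByteBuffer buffer = ByteBuffer.allocate(512);
            while (true) {
                int ready = selector.select(1000);             // 1-second timeout, like the timeval
                if (ready == 0) {
                    System.out.println("timeout, nothing readable");
                    continue;
                }
                Iterator<SelectionKey> keys = selector.selectedKeys().iterator();
                while (keys.hasNext()) {
                    SelectionKey key = keys.next();
                    keys.remove();                             // clear the key once handled
                    if (key.isReadable()) {
                        buffer.clear();
                        int n = channel.read(buffer);
                        if (n < 0) {                           // orderly shutdown from the peer
                            System.out.println("server closed the connection");
                            channel.close();
                            return;
                        }
                        System.out.println("read " + n + " bytes");
                    }
                }
            }
        }
    }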

    Read the article

  • Passing ActionListeners in Java, pack()

    - by Crystal
    Two questions. First question is I'm trying to create a simple form that when you press a button, it adds a Person object to the ArrayList. However, since I am not used to GUIs, I tried creating one and am first just trying to get the user input from the JTextField, create an ActionListener object of the appropriate type, so once that works, then I can pass in all the JTextField inputs to create my Person object. Unfortunately, I am not getting any data when I type in something to the firstName JTextField and was wondering if someone could look at my code below. import java.awt.*; import java.awt.event.*; import javax.swing.*; import java.util.List; import java.util.ArrayList; public class AddressBook { public static void main(String[] args) { EventQueue.invokeLater(new Runnable() { public void run() { AddressBookFrame frame = new AddressBookFrame(); frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); JMenuBar menuBar = new JMenuBar(); frame.setJMenuBar(menuBar); JMenu fileMenu = new JMenu("File"); JMenuItem openItem = new JMenuItem("Open"); JMenuItem saveItem = new JMenuItem("Save"); JMenuItem saveAsItem = new JMenuItem("Save As"); JMenuItem printItem = new JMenuItem("Print"); JMenuItem exitItem = new JMenuItem("Exit"); fileMenu.add(openItem); fileMenu.add(saveItem); fileMenu.add(saveAsItem); fileMenu.add(printItem); fileMenu.add(exitItem); menuBar.add(fileMenu); JMenu editMenu = new JMenu("Edit"); JMenuItem newItem = new JMenuItem("New"); JMenuItem editItem = new JMenuItem("Edit"); JMenuItem deleteItem = new JMenuItem("Delete"); JMenuItem findItem = new JMenuItem("Find"); JMenuItem firstItem = new JMenuItem("First"); JMenuItem previousItem = new JMenuItem("Previous"); JMenuItem nextItem = new JMenuItem("Next"); JMenuItem lastItem = new JMenuItem("Last"); editMenu.add(newItem); editMenu.add(editItem); editMenu.add(deleteItem); editMenu.add(findItem); editMenu.add(firstItem); editMenu.add(previousItem); editMenu.add(nextItem); editMenu.add(lastItem); menuBar.add(editMenu); JMenu helpMenu = new JMenu("Help"); JMenuItem documentationItem = new JMenuItem("Documentation"); JMenuItem aboutItem = new JMenuItem("About"); helpMenu.add(documentationItem); helpMenu.add(aboutItem); menuBar.add(helpMenu); frame.setVisible(true); } }); } } class AddressBookFrame extends JFrame { public AddressBookFrame() { setLayout(new BorderLayout()); setTitle("Address Book"); setSize(DEFAULT_WIDTH, DEFAULT_HEIGHT); AddressBookToolBar toolBar = new AddressBookToolBar(); add(toolBar, BorderLayout.NORTH); AddressBookStatusBar aStatusBar = new AddressBookStatusBar("5"); add(aStatusBar, BorderLayout.SOUTH); AddressBookForm form = new AddressBookForm(); add(form, BorderLayout.CENTER); } public static final int DEFAULT_WIDTH = 500; public static final int DEFAULT_HEIGHT = 500; } /* Create toolbar buttons and add buttons to toolbar */ class AddressBookToolBar extends JPanel { public AddressBookToolBar() { setLayout(new FlowLayout(FlowLayout.LEFT)); JToolBar bar = new JToolBar(); JButton newButton = new JButton("New"); JButton editButton = new JButton("Edit"); JButton deleteButton = new JButton("Delete"); JButton findButton = new JButton("Find"); JButton firstButton = new JButton("First"); JButton previousButton = new JButton("Previous"); JButton nextButton = new JButton("Next"); JButton lastButton = new JButton("Last"); bar.add(newButton); bar.add(editButton); bar.add(deleteButton); bar.add(findButton); bar.add(firstButton); bar.add(previousButton); bar.add(nextButton); bar.add(lastButton); add(bar); } } /* Creates 
the status bar string */ class AddressBookStatusBar extends JPanel { public AddressBookStatusBar(String statusBarString) { setLayout(new FlowLayout(FlowLayout.LEFT)); this.statusBarString = new JLabel("Total number of people: " + statusBarString); add(this.statusBarString); } private JLabel statusBarString; private int totalContacts; } class AddressBookForm extends JPanel { public AddressBookForm() { this.setLayout(new GridLayout(2, 1)); JPanel formPanel = new JPanel(); formPanel.setLayout(new GridLayout(4, 2)); JTextField firstName = new JTextField(20); JTextField lastName = new JTextField(20); JTextField telephone = new JTextField(20); JTextField email = new JTextField(20); JLabel firstNameLabel = new JLabel("First Name: ", JLabel.LEFT); formPanel.add(firstNameLabel); formPanel.add(firstName); JLabel lastNameLabel = new JLabel("Last Name: ", JLabel.LEFT); formPanel.add(lastNameLabel); formPanel.add(lastName); JLabel telephoneLabel = new JLabel("Telephone: ", JLabel.LEFT); formPanel.add(telephoneLabel); formPanel.add(telephone); JLabel emailLabel = new JLabel("Email: ", JLabel.LEFT); formPanel.add(emailLabel); formPanel.add(email); add(formPanel); JPanel buttonPanel = new JPanel(); JButton insertButton = new JButton("Insert"); JButton displayButton = new JButton("Display"); // create button actions AddressBookManager insertAction = new AddressBookManager(firstName.getText()); insertButton.addActionListener(insertAction); buttonPanel.add(insertButton); buttonPanel.add(displayButton); add(buttonPanel); } private List<Person> addressList = new ArrayList<Person>(); private class AddressBookManager implements ActionListener { public AddressBookManager(String text) { // addressList.add( setName(text); System.out.println("Test" + text); } public void actionPerformed(ActionEvent e) { System.out.println("Hello" + name); } public void setName(String name) { this.name = name; } private String name; } } Second question is, how do I make my form not take up the whole center space. I don't like the stretch look and was hoping the JTextFields could be just one line long, not a big box. Same thing with the buttons. Any thoughts? Thanks.
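    To narrow the first question down, here is the smallest runnable sketch I could write of what I think the listener should do (the class name and the printed output are just mine for testing): the field is read inside actionPerformed, at click time, rather than its text being passed into the listener's constructor, and pack() sizes the frame to fit its components instead of a fixed width and height, which I suspect also relates to my second question.

    import java.awt.FlowLayout;
    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.JTextField;

    public class ListenerSketch {
        public static void main(String[] args) {
            JFrame frame = new JFrame("Sketch");
            frame.setLayout(new FlowLayout());
            final JTextField firstName = new JTextField(20);
            JButton insertButton = new JButton("Insert");
            insertButton.addActionListener(new ActionListener() {
                public void actionPerformed(ActionEvent e) {
                    // Read the field when the button is pressed, not when the listener is built.
                    System.out.println("First name: " + firstName.getText());
                }
            });
            frame.add(firstName);
            frame.add(insertButton);
            frame.pack();                 // size the frame to its contents instead of setSize()
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        }
    }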

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest technology events in India, led by Microsoft. This event was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I have attended almost all the technology events here, I have not seen a bigger or better event in the Indian subcontinent than this one. There were 21 technical tracks at Tech·Ed India 2010, spanning more than 745 learning opportunities. I was fortunate enough to be a part of this whole event as both a speaker and a delegate. TechEd India Speaker Badge and A Token of Lifetime Hotel Selection I presented three different sessions at TechEd India and was also a part of a panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I stay away from my family occasionally. For this reason, I took my wife, Nupur, and my daughter Shaivi (8 months old) to the event with me. We stayed at the same hotel where the event was organized, so as to maximize my time bonding with my family and, at the same time, have more time for networking with the technology community. The hotel Lalit Ashok is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. The hotel was a bit pricey, but looking at all the advantages, I decided to book there. Hotel Lalit Ashok Nupur Dave and Shaivi Dave Arrival Day – DAY 0 – April 11, 2010 I reached the event a day early, and that was a wise decision, as I was able to relax a bit and go over my presentation for the next day. I am the kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and my family friends. I even checked out the location where I would be doing presentations the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft, who helped me out with a few of the logistics issues that occurred the day before. I was not aware of the fact that the very next day he was going to be "The Man" of the TechEd 2010 event. Vinod Kumar from Microsoft was really very kind as he talked to me regarding my subsequent session. He gave me some really helpful suggestions, which I was able to incorporate into my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the Community. Pradipta from Microsoft was also around, extremely busy with logistics; however, even in those busy times, he found some spare time to have a chat with me and the other Community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft. It was so interesting to listen to both of them talking about SharePoint. I have no words to express how overwhelmed I was by all these passionate young guys - Pradipta, Vinod, Bijoy, Harish, Sachin and Abhishek (of course!). Map of TechEd India 2010 Event Day 1 – April 12, 2010 From morning until night, it was truly a very busy day for me. I had two presentations and one panel discussion for the day. Needless to say, I had a few meetings to attend as well. The day started with a keynote from S. Somasegar, where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than-life uniform screen. It was truly something to see. 
    The title music of the keynote was very interesting, and it featured Bijoy Singhal as the model. It was interesting to talk to him afterwards, when we laughed together at jokes about his modeling assignment. TechEd India Keynote Opening Featuring Bijoy TechEd India 2010 Keynote – S. Somasegar Time: 11:15pm – 11:45pm Session 1: True Lies of SQL Server – SQL Myth Buster Following the excellent keynote, I had my very first session, on the subject of SQL Server Myth Buster. At first I was a bit nervous, for this was my very first session, coming right after the keynote, and during my presentation I saw lots of Microsoft Product Team members. Well, it really went well, and I had a really good discussion with the attendees of the session. I felt that well begun was half done, and my confidence was regained. Right after the session, I met a few of my Community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate a few SQL Server myths and their resolutions, backing them up with demos. This demo presentation is a must-attend for all developers and administrators who come to the event. This is going to be a very quick yet fun session. Pinal Presenting session at TechEd India 2010 Time: 1:00 PM – 2:00 PM Lunch with Somasegar After the session I went to see my daughter, and then I headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank Abhishek, who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and to talk with him about Microsoft technology. Though Somasegar holds such a high position at Microsoft, he is very polite and a real gentleman; how I wish that everybody in the industry were like him. Believe me, if you spread love and kindness, then that is what you will receive back. As soon as lunch time was over, I ran to the session hall, as my second presentation was about to start. Time: 2:30pm – 3:30pm Session 2: Master Data Services in Microsoft SQL Server 2008 R2 Business Intelligence is a subject which was widely talked about at TechEd. Everybody was interested in this subject, and I was no exception. I consider myself fortunate that I was presenting on the subject of Master Data Services at TechEd. When I had initially learned this subject, I had a bit of confusion about the usage of this tool. Later on, I decided that I would tackle how we developers and DBAs are not able to understand something as simple as this and, even worse, create confusion about the technology. During system design, it is very important to have reference material or master lookup tables. Well, I talked about that subject and presented the session keeping it as the center of my talk. The session went very well, and I received lots of interesting questions. I got many compliments for covering this subject through real-life scenarios. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions that helped me prepare the slide deck, as well as the subject. Pinal Presenting session at TechEd India 2010 The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft's platform appeal. 
    This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables a consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, the MDS master data hub, which is a vital component, helps ensure the consistency of reporting across systems and delivers faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise. Pinal Presenting session at TechEd India 2010 The day was still not over for me. I ran into several friends, but we were not able to keep our enthusiasm under control over all the rumors that SQL Server 2008 R2 was about to be launched the next day in the keynote. I then ran to my third and final technical event for the day – a panel discussion with some of the top technologists of India. Time: 5:00pm – 6:00pm Panel Discussion: Harness the power of Web – SEO and Technical Blogging As I had delivered two technical sessions by this time, I was a bit tired but no less enthusiastic when it came to talking about blogging and technology. We discussed many different topics there. I told them that the most important aspect of any blog is its content. We discussed in depth the issues with plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the Community about what the right kind of blogging is and what morally and technically wrong acts are. A couple of questions were raised about what kind of liberty a person can have in terms of writing blogs. Well, it was generally agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, and not providing incorrect information or practicing plagiarism, a blogger should be allowed to express himself. This panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable, and so it was extended for 30 minutes more. Finally, we decided to bring the discussion to a close and agreed that we would continue the topic next year. TechEd India Panel Discussion on Web, Technology and SEO Surprisingly, the day was just beginning after all of this. By this time, I had met almost all the MVPs who had arrived at the event, as well as many Microsoft employees. There were lots of Community folks present, too. I decided that I would go meet several friends from the Community and ask them to keep in touch with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me wherein I spoke in my mother tongue, Gujarati. It was funny that I talked in Gujarati almost all day, but when I was talking in the interview I could not find the right Gujarati words to speak. I think we all think in English when we think about technology, for the sake of universality. After meeting them, I headed towards the Speakers' Dinner. Time: 8:00 PM – onwards Speakers Dinner The Speakers' dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked about so many different things, from Xbox to Hindi movies, and from SQL to samosas. I just could not express how much fun I had. After a long evening, when I returned to my room and met Shaivi, I instantly felt relaxed. 
    Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: the Visual Studio launch, lunch with Somasegar, 2 technical sessions, 1 panel discussion, a community leaders meeting, the speakers' dinner and, last but not least, playing with my child! A perfect day! Day 2 – April 13, 2010 Today started with a bang with the excellent keynote by Kamal Hathi, who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 million rows in Excel brought lots of applause from the audience. Kamal Hathi Presenting Keynote at TechEd India 2010 The day was a bit easier for me. I had no sessions today and no events planned. I had a few meetings planned for the second day of the event. I sat in the speaker's lounge for half a day and met many people there. I attended nearly 9 different meetings today. The subjects of the meetings were very different. Here is a list of the topics of the Community-related meetings: SQL PASS and its involvement in India and the subcontinent How to start community blogging Forums and developing aptitude towards technology Ahmedabad/Gandhinagar User Groups and their developments SharePoint and SQL Business Meeting – a client meeting Business Meeting – a potential performance tuning project Business Meeting – Solid Quality Mentors (SolidQ) And family friends Pinal Dave at TechEd India The day passed by so quickly during these meetings. In the evening, I headed to the Partners Expo with friends and checked out a few of the booths. I really wanted to talk about some of the products, but due to the freebies there was such a crowd that I finally decided to just take the partners' contact details. I will now start sending them my queries and, hopefully, I will have my questions answered. Nupur and Shaivi also had one meeting to attend; it was with our family friend Vijay Raj. Vijay is also a person who loves technology, and loves it more than anybody. I see him growing and learning every day, but still remaining 'human'. I believe that if someone acquires as much knowledge as he has, that person will become either a computer or a cyborg. Yet Vijay is still a kind gentleman and remains a close family friend of ours. Shaivi was really happy to play with Uncle Vijay. Pinal Dave and Vijay Raj Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. Every time, he gives me credit for his success; I believe he is very humble. He has far more certifications than I do and has worked with SQL for many more years than I have. He is an excellent photographer as well. Most of the photos in this blog post were taken by him. I told him that if he ever wants a part-time job, he could do photography very well. Pinal Dave and Renuka Prasad I also met L Srividya from Microsoft, whom I was looking forward to meeting. She is a bundle of knowledge, and everyone would surely learn a lot from her. I was able to get a few minutes with her, and, well, I felt confident. She enlightened me with SQL Server BI concepts, domain management, SQL Server security and a few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint, but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I figured out that the recent training he delivered was on SQL Server 2008 R2. 
    I joked with him that it hurts my ego that he is now more popular in SQL training and consulting than I am. I am sure all of you agree that working with good people is a gift from God. I am fortunate enough to work with the best of the best industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari, Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!). (Please let me know if I met you at the event and forgot to list your name here.) Time: 8:00 PM – onwards Community Leaders Dinner After lots of meetings, I headed towards the Community Leaders dinner meeting and met almost all the folks I had met in the morning. The discussion was almost the same, but the really good thing was that we were enjoying it. The food was really good. Nupur was invited to the event, but Shaivi could not come. When Nupur tried to enter the event, she was stopped, as Shaivi did not have a pass to enter the dinner. Nupur explained that Shaivi is only 8 months old, does not eat outside food, and cannot stay by herself at this age, but the doorkeeper did not agree and said that without the entry details Shaivi could not go in, though Nupur could. Nupur called me on the phone and asked me to help her out. By the time I was outside, the organizer of the event had reached the door and happily approved Shaivi to join the party. Once in the party, Shaivi had lots of fun meeting so many people. Shaivi Dave and Abhishek Kant Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com) Day 3 – April 14, 2010 Though it was the last day, I was very excited, as I was about to present my very favorite session. Query Optimization and Performance Tuning is my domain expertise, and I make my living by consulting and training on it. Today's session was on the same subject and, as an additional twist, another subject, Spatial Database, was presented. I was always intrigued by spatial databases and have enjoyed learning about them; however, I had never thought about spatial indexing before it was decided that I would do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and also review the content. Furthermore, today was really what I call my 'learning day'. So far I had not attended any session at TechEd, and I felt a bit down about that. Everybody spends their valuable time and money to learn something new and exciting at TechEd, and I had not attended a single session, even though it was already the last day of the event. I did have a plan for the day, and I attended two technical sessions before my own spatial database session. I attended two sessions by Vinod Kumar. Vinod is a natural storyteller, and there was no doubt that his sessions would be jam-packed. People attend his sessions simply because Vinod is the speaker. He has never once disappointed his audience; he is truly a good speaker. He knows his stuff very well. I personally do not think that anyone in India can be compared to him for SQL. Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance I really had a fun time attending this session. Vinod made this session very interactive. The entire audience really got into the presentation and started participating in the event. 
    Vinod presented a small query tuning problem, one that any developer would have encountered, and solved it with the audience's help in such a fashion that each developer felt he or she had already resolved it. For one question, I was the only one who was ready to answer, and Vinod told me in a light tone that I was now allowed to answer it! The audience found it very amusing. There was a huge crowd around Vinod after the session. Vinod – A master storyteller! Time: 3:45pm-4:45pm Data Recovery / consistency with CheckDB This session was much heavier than the earlier one, and I must say it is my favorite session of all those I have EVER attended in India. At this TechEd I attended only two sessions, but in my career I have attended numerous technical sessions, not only in India but all over the world. This session took my breath away. One by one, Vinod took different databases and started to corrupt them in different ways. Each database had its own unique way of getting corrupted. Once that was done, Vinod brought out DBCC CHECKDB and demonstrated how it can solve your problem. He finally fixed all the databases with this single tool. I do have good knowledge of this subject, but let me honestly admit that I learned a lot from this session. I enjoyed and cheered during this session along with the other attendees. I had the total satisfaction that, just like everyone else, I took advantage of the event and learned something. I am now TECHnically EDucated. Pinal Dave and Vinod Kumar After two very interactive and informative SQL sessions from Vinod Kumar, the next turn was mine, presenting on Spatial Database and Indexing. I once again got nervous, but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better. Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Pinal Presenting session at TechEd India 2010 I kicked off this session with Michael J Swart's beautiful spatial image. This session was the last one of the day but, to my surprise, I had 200+ attendees. Slowly, the rain was starting outside, and I was worried that the hall would not be full; despite this, there was not a single seat available within the first five minutes of the session. Thanks to all of you for attending my presentation. I demonstrated the map of the world (and of India) and quickly explained what the geography and geometry data types in a spatial database are. This session told an interesting story of indexing and comparison, as well as how different traditional indexes are from spatial indexes. Pinal Presenting session at TechEd India 2010 Due to the heavy rain during this event, the power went off for about 22 minutes (just an accident – nobody's fault). During these minutes, there was no audio, no video and no light. I continued to address the mass of 200+ people without any audio device or PowerPoint. I must thank the audience, because not a single person left the session. They all stayed in their places; some moved closer to listen to me properly. I noticed that the curiosity and eagerness to learn new things was at its peak, even though it was the very last session of TechEd. Everybody wanted to get the maximum knowledge out of this whole event. I was touched by the support from the audience. They listened to and participated in my session even without any kind of technology (no PPT, no mic, no AC, nothing). 
    During these 22 minutes, I completed my theory verbally. Pinal Presenting session at TechEd India 2010 After a while, we got the projector back online, and we continued with some exciting demos. Many thanks to the Microsoft people who worked energetically in the background to get backup power to the projector. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the map of India and found the aerial distance between them. After finding the aerial distance, we browsed online and found that SQL Server estimates the aerial distance between these two cities accurately, as compared to the factual distance. There was huge applause from the crowd at the fact that SQL Server takes into account the curvature of the earth and finds precise distances based on it. During the process of finding the distance, I demonstrated a few examples of the indexes, where I showed how one can use those indexes to find these distances and how they can improve the performance of similar queries. I also demonstrated a few examples wherein we were able to see for which data types the index is most useful. We finished the demos with a bit more on the internals. Pinal Presenting session at TechEd India 2010 Despite all the issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that, for a moment, the rain was not audible. I was truly moved by the dedication of the technology enthusiasts. Pinal Dave After Presenting session at TechEd India 2010 The abstract of the session is as follows: Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high-performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more. Time: 8:00 PM – onwards Dinner by Sponsors After the lively sessions during the day, there was another dinner party, courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing from all angles. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event. Pinal Dave and Bijoy Singhal It was really a wonderful event. Even after writing this much, I have no words to express how much I enjoyed TechEd. It is true that I have shared with you only 1% of all the activities I did at the event. There were so many people I met who are not mentioned here, although I wanted to write their names here, too. 
    Anyway, I learned so many things, and even now I am not able to get over all the fun I had at this event. Pinal Dave at TechEd India 2010 The Next Days – April 15, 2010 – till today I am still not able to get my mind off the whole experience I had at TechEd India 2010. It was like a whole Microsoft family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience! Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest Technology events in India led by Microsoft. This event was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I attempted to attend almost all the technology events here, I have not seen any bigger or better event in Indian subcontinents other than this. There are 21 Technical Tracks at Tech·Ed India 2010 that span more than 745 learning opportunities. I was fortunate enough to be a part of this whole event as a speaker and a delegate, as well. TechEd India Speaker Badge and A Token of Lifetime Hotel Selection I presented three different sessions at TechEd India and was also a part of panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I stay away from my family occasionally. For this reason, I took my wife – Nupur and daughter Shaivi (8 months old) to the event along with me. We stayed at the same hotel where the event was organized so as to maximize my time bonding with my family and to have more time in networking with technology community, at the same time. The hotel Lalit Ashok is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. The cost of the hotel was a bit pricey, but looking at all the advantages, I had decided to ask for a booking there. Hotel Lalit Ashok Nupur Dave and Shaivi Dave Arrival Day – DAY 0 – April 11, 2010 I reached the event a day earlier, and that was one wise decision for I was able to relax a bit and go over my presentation for the next day’s course. I am a kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and my family friends. I even checked out the location where I would be doing presentations the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft who helped me out with a few of the logistics issues that occured the day before. I was not aware of the fact that the very next day he was going to be “The Man” of the TechEd 2010 event. Vinod Kumar from Microsoft was really very kind as he talked to me regarding my subsequent session. He gave me some suggestions which were really helpful that I was able to incorporate them during my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the Community. Pradipta from Microsoft was also around, being extremely busy with logistics; however, in those busy times, he did find some good spare time to have a chat with me and the other Community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft. It was so interesting to listen to both of them talking about SharePoint. I just have no words to express my overwhelmed spirit because of all these passionate young guys - Pradipta,Vinod, Bijoy, Harish, Sachin and Ahishek (of course!). Map of TechEd India 2010 Event Day 1 – April 12, 2010 From morning until night time, today was truly a very busy day for me. I had two presentations and one panel discussion for the day. Needless to say, I had a few meetings to attend as well. The day started with a keynote from S. Somaseger where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than- life uniform screen. This was truly one to show. 
The title music of the keynote was very interesting and it featured Bijoy Singhal as the model. It was interesting to talk to him afterwards, when we laughed at jokes together about his modeling assignment. TechEd India Keynote Opening Featuring Bijoy TechEd India 2010 Keynote – S. Somasegar Time: 11:15pm – 11:45pm Session 1: True Lies of SQL Server – SQL Myth Buster Following the excellent keynote, I had my very first session on the subject of SQL Server Myth Buster. At first, I was a bit nervous as right after the keynote, for this was my very first session and during my presentation I saw lots of Microsoft Product Team members. Well, it really went well and I had a really good discussion with attendees of the session. I felt that a well begin was half-done and my confidence was regained. Right after the session, I met a few of my Community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate few SQL Server Myths and their resolutions as I back them up with some demo. This demo presentation is a must-attend for all developers and administrators who would come to the event. This is going to be a very quick yet fun session. Pinal Presenting session at TechEd India 2010 Time: 1:00 PM – 2:00 PM Lunch with Somasegar After the session I went to see my daughter, and then I headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank to Abhishek who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and had to talk with them on Microsoft Technology. Though Somasegar is currently holding such a high position in Microsoft, he is very polite and a real gentleman, and how I wish that everybody in industry is like him. Believe me, if you spread love and kindness, then that is what you will receive back. As soon as lunch time was over, I ran to the session hall as my second presentation was about to start. Time: 2:30pm – 3:30pm Session 2: Master Data Services in Microsoft SQL Server 2008 R2 Business Intelligence is a subject which was widely talked about at TechEd. Everybody was interested in this subject, and I did not excuse myself from this great concept as well. I consider myself fortunate as I was presenting on the subject of Master Data Services at TechEd. When I had initially learned this subject, I had a bit of confusion about the usage of this tool. Later on, I decided that I would tackle about how we all developers and DBAs are not able to understand something so simple such as this, and even worst, creating confusion about the technology. During system designing, it is very important to have a reference material or master lookup tables. Well, I talked about the same subject and presented the session keeping that as my center talk. The session went very well and I received lots of interesting questions. I got many compliments for talking about this subject on the real-life scenario. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions that helped me prepare the slide deck, as well as the subject. Pinal Presenting session at TechEd India 2010 The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft’s platform appeal. 
This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, MDS – Master Data-hub which is a vital component, helps ensure the consistency of reporting across systems and deliver faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise. Pinal Presenting session at TechEd India 2010 The day was still not over for me. I had ran into several friends but we were not able keep our enthusiasm under control about all the rumors saying that SQL Server 2008 R2 was about to be launched tomorrow in the keynote. I then ran to my third and final technical event for the day- a panel discussion with the top technologies of India. Time: 5:00pm – 6:00pm Panel Discussion: Harness the power of Web – SEO and Technical Blogging As I have delivered two technical sessions by this time, I was a bit tired but  not less enthusiastic when I had to talk about Blog and Technology. We discussed many different topics there. I told them that the most important aspect for any blog is its content. We discussed in depth the issues with plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the Community about what the right kind of blogging is and what morally and technically wrong acts are. A couple of questions were raised about what type of liberty a person can have in terms of writing blogs. Well, it was generically agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, but not providing incorrect information or not practicing plagiarism, a blogger should be allowed to express himself. This panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable and so it was extended for 30 minutes more. Finally, we decided to bring to a close the discussion and agreed that we will continue the topic next year. TechEd India Panel Discussion on Web, Technology and SEO Surprisingly, the day was just beginning after doing all of these. By this time, I have almost met all the MVP who arrived at the event, as well as many Microsoft employees. There were lots of Community folks present, too. I decided that I would go to meet several friends from the Community and continue to communicate with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me wherein I spoke in my mother’s tongue, Gujarati. It was funny that I talked in Gujarati almost all the day, but when I was talking in the interview I could not find the right Gujarati words to speak. I think we all think in English when we think about Technology, so as to address universality. After meeting them, I headed towards the Speakers’ Dinner. Time: 8:00 PM – onwards Speakers Dinner The Speakers’ dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked so many different things, from XBOX to Hindi Movies, and from SQL to Samosas. I just could not express how much fun I had. After a long evening, when I returned tmy room and met Shaivi, I just felt instantly relaxed. 
Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: the Visual Studio launch, lunch with Somasegar, 2 technical sessions, 1 panel discussion, the community leaders meeting, the speakers dinner and, last but not least, playing with my child! A perfect day! Day 2 – April 13, 2010 Today started with a bang with the excellent keynote by Kamal Hathi, who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 million rows in Excel brought lots of applause from the audience. Kamal Hathi Presenting Keynote at TechEd India 2010 The day was a bit easier for me. I had no sessions today and no events planned, only a few meetings for the second day of the event. I sat in the speakers’ lounge for half a day and met many people there. I attended nearly 9 different meetings today, on very different subjects. Here is a list of the topics of the Community-related meetings: SQL PASS and its involvement in India and the subcontinent; how to start community blogging; forums and developing an aptitude for technology; Ahmedabad/Gandhinagar User Groups and their development; SharePoint and SQL; Business Meeting – a client meeting; Business Meeting – a potential performance tuning project; Business Meeting – Solid Quality Mentors (SolidQ); and family friends Pinal Dave at TechEd India The day passed by so quickly with these meetings. In the evening, I headed to the Partners Expo with friends and checked out a few of the booths. I really wanted to talk about some of the products, but due to the freebies there was such a crowd that I finally decided to just take the partners’ contact details. I will now start sending them my queries and, hopefully, I will have my questions answered. Nupur and Shaivi also had one meeting to attend; it was with our family friend Vijay Raj. Vijay is also a person who loves technology, and loves it more than anybody. I see him growing and learning every day, yet still remaining ‘human’. I believe that if someone else acquired as much knowledge as he has, that person would become either a computer or a cyborg. Vijay, however, is still a kind gentleman and remains our close family friend. Shaivi was really happy to play with Uncle Vijay. Pinal Dave and Vijay Raj Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. He is very humble and always gives me credit for his success. He has far more certifications than me and has worked many more years with SQL than I have. He is an excellent photographer as well; most of the photos in this blog post were taken by him. I told him that if he ever wants a part-time job, he could do photography very well. Pinal Dave and Renuka Prasad I also met L Srividya from Microsoft, whom I was looking forward to meeting. She is a bundle of knowledge, and everyone would surely learn a lot from her. I was able to get a few minutes with her and, well, I felt confident. She enlightened me on SQL Server BI concepts, domain management, SQL Server security and a few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint, but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I found out that the recent training he delivered was on SQL Server 2008 R2.
I joked with him that it hurts my ego that he is now more popular in SQL training and consulting than me. I am sure all of you agree that working with good people is a gift from God, and I am fortunate enough to work with the best of the best industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari, Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!). (Please let me know if I met you at the event and forgot to list your name here.) Time: 8:00 PM – onwards Community Leaders Dinner After lots of meetings, I headed towards the Community Leaders dinner and met almost all the folks I had met in the morning. The discussion was almost the same, but the really good thing was that we were enjoying it. The food was really good. Nupur was invited to the event, but Shaivi could not come. When Nupur tried to enter, she was stopped because Shaivi did not have a pass for the dinner. Nupur explained that Shaivi is only 8 months old, does not eat outside food, and cannot stay by herself at this age, but the doorkeeper did not agree and insisted that without the entry details Shaivi could not go in, though Nupur could. Nupur called me on the phone and asked me to help her out. By the time I got outside, the organizer of the event had reached the door and happily approved Shaivi to join the party. Once in the party, Shaivi had lots of fun meeting so many people. Shaivi Dave and Abhishek Kant Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com) Day 3 – April 14, 2010 Though it was the last day, I was very excited, as I was about to present my very favorite session. Query Optimization and Performance Tuning is my domain expertise, and I make my living by consulting and training on the same. Today’s session was on that subject and, as an additional twist, Spatial Database was presented as well. I was always intrigued by Spatial Database and I have enjoyed learning about it; however, I had never thought much about Spatial Indexing before it was decided that I would do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and also review the content. Furthermore, today was really what I call my ‘learning day’. So far I had not attended any session at TechEd and I felt a bit down about that. Everybody spends their valuable time and money to learn something new and exciting at TechEd, and I had not attended a single session even though it was already the last day of the event. I did have a plan for the day, and I attended two technical sessions before my own session on spatial database. I attended 2 sessions by Vinod Kumar. Vinod is a natural storyteller, and there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod was the speaker. He has never once disappointed an audience; he is truly a good speaker and knows his stuff very well. I personally do not think that anyone in India can be compared to him for SQL. Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance I really had a fun time attending this session. Vinod made it very interactive; the entire audience really got into the presentation and started participating.
Vinod presented a small Query Tuning problem – one any developer would have encountered – and solved it with the audience’s help in such a fashion that every developer felt he or she had resolved it themselves. For one question, I was the only one ready to answer, and Vinod told me in a light tone that I was not allowed to answer it! The audience found it very amusing. There was a huge crowd around Vinod after the session. Vinod – A master storyteller! Time: 3:45pm-4:45pm Data Recovery / consistency with CheckDB This session was much heavier than the earlier one, and I must say it is my favorite session I have EVER attended in India. At this TechEd I attended only two sessions, but in my career I have attended numerous technical sessions, not only in India but all over the world, and this session took my breath away. One by one, Vinod took different databases and started to corrupt them in different ways; each database had its own unique way of getting corrupted. Once that was done, Vinod brought out DBCC CHECKDB and demonstrated how it can solve the problem, finally fixing all the databases with this single tool. I have good knowledge of this subject, but let me honestly admit that I learned a lot from this session. I enjoyed and cheered during this session along with the other attendees. I had the total satisfaction that, just like everyone else, I took advantage of the event and learned something. I am now TECHnically EDucated. Pinal Dave and Vinod Kumar After two very interactive and informative SQL sessions from Vinod Kumar, it was my turn to present on Spatial Database and Indexing. I once again got nervous, but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better. Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Pinal Presenting session at TechEd India 2010 Pinal Presenting session at TechEd India 2010 I kicked off this session with Michael J Swart‘s beautiful spatial image. This session was the last one of the day but, to my surprise, I had more than 200 attendees. The rain was slowly starting outside and I was worried that the hall would not be full; despite this, there was not a single seat available within the first five minutes of the session. Thanks to all of you for attending my presentation. I demonstrated the map of the world (and India) and quickly explained what the Geography and Geometry data types in Spatial Database are. The session also told the story of indexing and comparison, and of how different traditional indexes are from spatial indexing. Pinal Presenting session at TechEd India 2010 Due to the heavy rain during the event, the power went off for about 22 minutes (just an accident – nobody’s fault). During these minutes, there was no audio, no video and no light. I continued to address the mass of 200+ people without any audio device or PowerPoint. I must thank the audience, because not a single person left the session. They all stayed in their places, and some moved closer to hear me properly. I noticed that the curiosity and eagerness to learn new things was at its peak even though it was the very last session of TechEd. Everybody wanted to get the maximum knowledge out of the whole event. I was touched by the support from the audience. They listened and participated in my session even without any kind of technology (no ppt, no mike, no AC, nothing).
During these 22 minutes, I completed my theory verbally. Pinal Presenting session at TechEd India 2010 After a while, we got the projector back online and we continued with some exciting demos. Many thanks to the Microsoft people who worked energetically in the background to get the backup power for the projector up. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the India map and found the aerial distance between them. After finding the aerial distance, we browsed online and found that SQL Server estimates the aerial distance between these two cities very closely, compared to the factual distance. There was huge applause from the crowd over the fact that SQL Server takes the curvature of the earth into account and finds precise distances based on those details. During the process of finding the distance, I demonstrated a few examples of indexes, where I showed how one can use those indexes to find these distances and how they can improve the performance of similar queries. I also demonstrated a few examples wherein we could see for which data type the index is most useful. We finished the demos with a bit more internal stuff. Pinal Presenting session at TechEd India 2010 Despite all the issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that for a moment the rain was not audible. I was truly moved by the dedication of the technology enthusiasts. Pinal Dave After Presenting session at TechEd India 2010 The abstract of the session is as follows: Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use the spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high-performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more. Time: 8:00 PM – onwards Dinner by Sponsors After the lively sessions during the day, there was another dinner party, courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing from all angles. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event. Pinal Dave and Bijoy Singhal It was really one wonderful event. After writing this much, I can say that I have no words to express how much I enjoyed TechEd. However, it is true that I have shared with you only 1% of the total activities I did at the event. There were so many people I met who are not mentioned here, although I wanted to write their names here, too.
Anyway, I have learned so many things and up until now, I am not able to get over all the fun I had in this event. Pinal Dave at TechEd India 2010 The Next Days – April 15, 2010 – till today I am still not able to get my mind out of the whole experience I had at TechEd India 2010. It was like a whole Microsoft Family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience! Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn


  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure. Or we can store our VHD files in blob storage and mount them as a hard drive in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blob and block blob. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunk read and write, which is the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is very powerful for uploading large files, as we can upload blocks in parallel and provide a pause-resume feature. There are many documents, articles and blog posts that describe how to upload a block blob. Most of them focus on the server side, which means: once you have received a big file, stream or binary, how to upload it into blob storage in blocks through the .NET SDK.  But the problem is, how can we upload these large files from the client side, for example, a browser? This question came to me when I was working with a Chinese customer to help them build a network disk product on top of Azure. The end users upload their files from the web portal, and then the files are stored in blob storage from the Web Role. My goal was to find the best way to transfer the file from the client (the end user’s machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed and save them as blocks into Windows Azure Blob Storage.   Traditional Upload, Works with Limitation The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button. 1: @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" })) 2: { 3: <input type="file" name="file" /> 4: <input type="submit" value="upload" /> 5: } And then in the backend controller, we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit them. The code has been well blogged in the community.
1: [HttpPost] 2: public ActionResult About(HttpPostedFileBase file) 3: { 4: var container = _client.GetContainerReference("test"); 5: container.CreateIfNotExists(); 6: var blob = container.GetBlockBlobReference(file.FileName); 7: var blockDataList = new Dictionary<string, byte[]>(); 8: using (var stream = file.InputStream) 9: { 10: var blockSizeInKB = 1024; 11: var offset = 0; 12: var index = 0; 13: while (offset < stream.Length) 14: { 15: var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset); 16: var blockData = new byte[readLength]; 17: offset += stream.Read(blockData, 0, readLength); 18: blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData); 19:  20: index++; 21: } 22: } 23:  24: Parallel.ForEach(blockDataList, (bi) => 25: { 26: blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null); 27: }); 28: blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray()); 29:  30: return RedirectToAction("About"); 31: } This works perfect if we selected an image, a music or a small video to upload. But if I selected a large file, let’s say a 6GB HD-movie, after upload for about few minutes the page will be shown as below and the upload will be terminated. In ASP.NET there is a limitation of request length and the maximized request length is defined in the web.config file. It’s a number which less than about 4GB. So if we want to upload a really big file, we cannot simply implement in this way. Also, in Windows Azure, a cloud service network load balancer will terminate the connection if exceed the timeout period. From my test the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitation mentioned above, the simple HTML file upload cannot provide rich upload experience such as chunk upload, pause and pause-resume. So we need to find a better way to upload large file from the client to the server.   Upload in Chunks through HTML5 and JavaScript In order to break those limitation mentioned above we will try to upload the large file in chunks. This takes some benefit to us such as - No request size limitation: Since we upload in chunks, we can define the request size for each chunks regardless how big the entire file is. - No timeout problem: The size of chunks are controlled by us, which means we should be able to make sure request for each chunk upload will not exceed the timeout period of both ASP.NET and Windows Azure load balancer. It was a big challenge to upload big file in chunks until we have HTML5. There are some new features and improvements introduced in HTML5 and we will use them to implement our solution.   In HTML5, the File interface had been improved with a new method called “slice”. It can be used to read part of the file by specifying the start byte index and the end byte index. For example if the entire file was 1024 bytes, file.slice(512, 768) will read the part of this file from the 512nd byte to 768th byte, and return a new object of interface called "Blob”, which you can treat as an array of bytes. In fact,  a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about the Blob please refer here. File and Blob is very useful to implement the chunk upload. 
We will use File interface to represent the file the user selected from the browser and then use File.slice to read the file in chunks in the size we wanted. For example, if we wanted to upload a 10MB file with 512KB chunks, then we can read it in 512KB blobs by using File.slice in a loop.   Assuming we have a web page as below. User can select a file, an input box to specify the block size in KB and a button to start upload. 1: <div> 2: <input type="file" id="upload_files" name="files[]" /><br /> 3: Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br /> 4: <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" /> 5: </div> Then we can have the JavaScript function to upload the file in chunks when user clicked the button. 1: <script type="text/javascript"> 1: 2: $(function () { 3: $("#upload_button_blob").click(function () { 4: }); 5: });</script> Firstly we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke the File, Blob and FormData from the “window” object. If any of them is “undefined” the condition result will be “false” which means your browser doesn’t support these premium feature and it’s time for you to get your browser updated. FormData is another new feature we are going to use in the future. It could generate a temporary form for us. We will use this interface to create a form with chunk and associated metadata when invoked the service through ajax. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: if (window.File && window.Blob && window.FormData) { 4: alert("Your brwoser is awesome, let's rock!"); 5: } 6: else { 7: alert("Oh man plz update to a modern browser before try is cool stuff out."); 8: return; 9: } 10: }); Each browser supports these interfaces by their own implementation and currently the Blob, File and File.slice are supported by Chrome 21, FireFox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we worked on the files the user selected one by one since in HTML5, user can select multiple files in one file input box. 1: var files = $("#upload_files")[0].files; 2: for (var i = 0; i < files.length; i++) { 3: var file = files[i]; 4: var fileSize = file.size; 5: var fileName = file.name; 6: } Next, we calculated the start index and end index for each chunks based on the size the user specified from the browser. We put them into an array with the file name and the index, which will be used when we upload chunks into Windows Azure Blob Storage as blocks since we need to specify the target blob name and the block index. At the same time we will store the list of all indexes into another variant which will be used to commit blocks into blob in Azure Storage once all chunks had been uploaded successfully. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 
4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10:  11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: var blockSizeInKB = $("#block_size").val(); 14: var blockSize = blockSizeInKB * 1024; 15: var blocks = []; 16: var offset = 0; 17: var index = 0; 18: var list = ""; 19: while (offset < fileSize) { 20: var start = offset; 21: var end = Math.min(offset + blockSize, fileSize); 22:  23: blocks.push({ 24: name: fileName, 25: index: index, 26: start: start, 27: end: end 28: }); 29: list += index + ","; 30:  31: offset = end; 32: index++; 33: } 34: } 35: }); Now we have all chunks’ information ready. The next step should be upload them one by one to the server side, and at the server side when received a chunk it will upload as a block into Blob Storage, and finally commit them with the index list through BlockBlobClient.PutBlockList. But since all these invokes are ajax calling, which means not synchronized call. So we need to introduce a new JavaScript library to help us coordinate the asynchronize operation, which named “async.js”. You can download this JavaScript library here, and you can find the document here. I will not explain this library too much in this post. We will put all procedures we want to execute as a function array, and pass into the proper function defined in async.js to let it help us to control the execution sequence, in series or in parallel. Hence we will define an array and put the function for chunk upload into this array. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4:  5: // start to upload each files in chunks 6: var files = $("#upload_files")[0].files; 7: for (var i = 0; i < files.length; i++) { 8: var file = files[i]; 9: var fileSize = file.size; 10: var fileName = file.name; 11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: ... ... 14:  15: // define the function array and push all chunk upload operation into this array 16: blocks.forEach(function (block) { 17: putBlocks.push(function (callback) { 18: }); 19: }); 20: } 21: }); 22: }); As you can see, I used File.slice method to read each chunks based on the start and end byte index we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new feature in HTML5 named FormData. Then post this form to the backend server through jQuery.ajax. This is the key part of our solution. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 
13: // define the function array and push all chunk upload operation into this array 14: blocks.forEach(function (block) { 15: putBlocks.push(function (callback) { 16: // load blob based on the start and end index for each chunks 17: var blob = file.slice(block.start, block.end); 18: // put the file name, index and blob into a temporary from 19: var fd = new FormData(); 20: fd.append("name", block.name); 21: fd.append("index", block.index); 22: fd.append("file", blob); 23: // post the form to backend service (asp.net mvc controller action) 24: $.ajax({ 25: url: "/Home/UploadInFormData", 26: data: fd, 27: processData: false, 28: contentType: "multipart/form-data", 29: type: "POST", 30: success: function (result) { 31: if (!result.success) { 32: alert(result.error); 33: } 34: callback(null, block.index); 35: } 36: }); 37: }); 38: }); 39: } 40: }); Then we will invoke these functions one by one by using the async.js. And once all functions had been executed successfully I invoked another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.series(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); That’s all in the client side. The outline of our logic would be - Calculate the start and end byte index for each chunks based on the block size. - Defined the functions of reading the chunk form file and upload the content to the backend service through ajax. - Execute the functions defined in previous step with “async.js”. - Commit the chunks by invoking the backend service in Windows Azure Storage finally.   Save Chunks as Blocks into Blob Storage In above we finished the client size JavaScript code. It uploaded the file in chunks to the backend service which we are going to implement in this step. We will use ASP.NET MVC as our backend service, and it will receive the chunks, upload into Windows Azure Bob Storage in blocks, then finally commit as one blob. As in the client side we uploaded chunks by invoking the ajax call to the URL "/Home/UploadInFormData", I created a new action under the Index controller and it only accepts HTTP POST request. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: } 8: catch (Exception e) 9: { 10: error = e.ToString(); 11: } 12:  13: return new JsonResult() 14: { 15: Data = new 16: { 17: success = string.IsNullOrWhiteSpace(error), 18: error = error 19: } 20: }; 21: } Then I retrieved the file name, index and the chunk content from the Request.Form object, which was passed from our client side. 
And then, used the Windows Azure SDK to create a blob container (in this case we will use the container named “test”.) and create a blob reference with the blob name (same as the file name). Then uploaded the chunk as a block of this blob with the index, since in Blob Storage each block must have an index (ID) associated with so that finally we can put all blocks as one blob by specifying their block ID list. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var index = int.Parse(Request.Form["index"]); 9: var file = Request.Files[0]; 10: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 11:  12: var container = _client.GetContainerReference("test"); 13: container.CreateIfNotExists(); 14: var blob = container.GetBlockBlobReference(name); 15: blob.PutBlock(id, file.InputStream, null); 16: } 17: catch (Exception e) 18: { 19: error = e.ToString(); 20: } 21:  22: return new JsonResult() 23: { 24: Data = new 25: { 26: success = string.IsNullOrWhiteSpace(error), 27: error = error 28: } 29: }; 30: } Next, I created another action to commit the blocks into blob once all chunks had been uploaded. Similarly, I retrieved the blob name from the Request.Form. I also retrieved the chunks ID list, which is the block ID list from the Request.Form in a string format, split them as a list, then invoked the BlockBlob.PutBlockList method. After that our blob will be shown in the container and ready to be download. 1: [HttpPost] 2: public JsonResult Commit() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var list = Request.Form["list"]; 9: var ids = list 10: .Split(',') 11: .Where(id => !string.IsNullOrWhiteSpace(id)) 12: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 13: .ToArray(); 14:  15: var container = _client.GetContainerReference("test"); 16: container.CreateIfNotExists(); 17: var blob = container.GetBlockBlobReference(name); 18: blob.PutBlockList(ids); 19: } 20: catch (Exception e) 21: { 22: error = e.ToString(); 23: } 24:  25: return new JsonResult() 26: { 27: Data = new 28: { 29: success = string.IsNullOrWhiteSpace(error), 30: error = error 31: } 32: }; 33: } Now we finished all code we need. The whole process of uploading would be like this below. Below is the full client side JavaScript code. 
1: <script type="text/javascript" src="~/Scripts/async.js"></script> 2: <script type="text/javascript"> 3: $(function () { 4: $("#upload_button_blob").click(function () { 5: // assert the browser support html5 6: if (window.File && window.Blob && window.FormData) { 7: alert("Your brwoser is awesome, let's rock!"); 8: } 9: else { 10: alert("Oh man plz update to a modern browser before try is cool stuff out."); 11: return; 12: } 13:  14: // start to upload each files in chunks 15: var files = $("#upload_files")[0].files; 16: for (var i = 0; i < files.length; i++) { 17: var file = files[i]; 18: var fileSize = file.size; 19: var fileName = file.name; 20:  21: // calculate the start and end byte index for each blocks(chunks) 22: // with the index, file name and index list for future using 23: var blockSizeInKB = $("#block_size").val(); 24: var blockSize = blockSizeInKB * 1024; 25: var blocks = []; 26: var offset = 0; 27: var index = 0; 28: var list = ""; 29: while (offset < fileSize) { 30: var start = offset; 31: var end = Math.min(offset + blockSize, fileSize); 32:  33: blocks.push({ 34: name: fileName, 35: index: index, 36: start: start, 37: end: end 38: }); 39: list += index + ","; 40:  41: offset = end; 42: index++; 43: } 44:  45: // define the function array and push all chunk upload operation into this array 46: var putBlocks = []; 47: blocks.forEach(function (block) { 48: putBlocks.push(function (callback) { 49: // load blob based on the start and end index for each chunks 50: var blob = file.slice(block.start, block.end); 51: // put the file name, index and blob into a temporary from 52: var fd = new FormData(); 53: fd.append("name", block.name); 54: fd.append("index", block.index); 55: fd.append("file", blob); 56: // post the form to backend service (asp.net mvc controller action) 57: $.ajax({ 58: url: "/Home/UploadInFormData", 59: data: fd, 60: processData: false, 61: contentType: "multipart/form-data", 62: type: "POST", 63: success: function (result) { 64: if (!result.success) { 65: alert(result.error); 66: } 67: callback(null, block.index); 68: } 69: }); 70: }); 71: }); 72:  73: // invoke the functions one by one 74: // then invoke the commit ajax call to put blocks into blob in azure storage 75: async.series(putBlocks, function (error, result) { 76: var data = { 77: name: fileName, 78: list: list 79: }; 80: $.post("/Home/Commit", data, function (result) { 81: if (!result.success) { 82: alert(result.error); 83: } 84: else { 85: alert("done!"); 86: } 87: }); 88: }); 89: } 90: }); 91: }); 92: </script> And below is the full ASP.NET MVC controller code. 
1: public class HomeController : Controller 2: { 3: private CloudStorageAccount _account; 4: private CloudBlobClient _client; 5:  6: public HomeController() 7: : base() 8: { 9: _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString")); 10: _client = _account.CreateCloudBlobClient(); 11: } 12:  13: public ActionResult Index() 14: { 15: ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application."; 16:  17: return View(); 18: } 19:  20: [HttpPost] 21: public JsonResult UploadInFormData() 22: { 23: var error = string.Empty; 24: try 25: { 26: var name = Request.Form["name"]; 27: var index = int.Parse(Request.Form["index"]); 28: var file = Request.Files[0]; 29: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 30:  31: var container = _client.GetContainerReference("test"); 32: container.CreateIfNotExists(); 33: var blob = container.GetBlockBlobReference(name); 34: blob.PutBlock(id, file.InputStream, null); 35: } 36: catch (Exception e) 37: { 38: error = e.ToString(); 39: } 40:  41: return new JsonResult() 42: { 43: Data = new 44: { 45: success = string.IsNullOrWhiteSpace(error), 46: error = error 47: } 48: }; 49: } 50:  51: [HttpPost] 52: public JsonResult Commit() 53: { 54: var error = string.Empty; 55: try 56: { 57: var name = Request.Form["name"]; 58: var list = Request.Form["list"]; 59: var ids = list 60: .Split(',') 61: .Where(id => !string.IsNullOrWhiteSpace(id)) 62: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 63: .ToArray(); 64:  65: var container = _client.GetContainerReference("test"); 66: container.CreateIfNotExists(); 67: var blob = container.GetBlockBlobReference(name); 68: blob.PutBlockList(ids); 69: } 70: catch (Exception e) 71: { 72: error = e.ToString(); 73: } 74:  75: return new JsonResult() 76: { 77: Data = new 78: { 79: success = string.IsNullOrWhiteSpace(error), 80: error = error 81: } 82: }; 83: } 84: } And if we selected a file from the browser we will see our application will upload chunks in the size we specified to the server through ajax call in background, and then commit all chunks in one blob. Then we can find the blob in our Windows Azure Blob Storage.   Optimized by Parallel Upload In previous example we just uploaded our file in chunks. This solved the problem that ASP.NET MVC request content size limitation as well as the Windows Azure load balancer timeout. But it might introduce the performance problem since we uploaded chunks in sequence. In order to improve the upload performance we could modify our client side code a bit to make the upload operation invoked in parallel. The good news is that, “async.js” library provides the parallel execution function. If you remembered the code we invoke the service to upload chunks, it utilized “async.series” which means all functions will be executed in sequence. Now we will change this code to “async.parallel”. This will invoke all functions in parallel. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 
15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallel(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); In this way all chunks will be uploaded to the server side at the same time to maximize the bandwidth usage. This should work if the file was not very large and the chunk size was not very small. But for large file this might introduce another problem that too many ajax calls are sent to the server at the same time. So the best solution should be, upload the chunks in parallel with maximum concurrency limitation. The code below specified the concurrency limitation to 4, which means at the most only 4 ajax calls could be invoked at the same time. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallelLimit(putBlocks, 4, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: });   Summary In this post we discussed how to upload files in chunks to the backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leverage three new feature introduced in HTML 5 which are - File.slice: Read part of the file by specifying the start and end byte index. - Blob: File-like interface which contains the part of the file content. - FormData: Temporary form element that we can pass the chunk alone with some metadata to the backend service. Then we discussed the performance consideration of chunk uploading. Sequence upload cannot provide maximized upload speed, but the unlimited parallel upload might crash the browser and server if too many chunks. So we finally came up with the solution to upload chunks in parallel with the concurrency limitation. We also demonstrated how to utilize “async.js” JavaScript library to help us control the asynchronize call and the parallel limitation.   Regarding the chunk size and the parallel limitation value there is no “best” value. You need to test vary composition and find out the best one for your particular scenario. It depends on the local bandwidth, client machine cores and the server side (Windows Azure Cloud Service Virtual Machine) cores, memory and bandwidth. Below is one of my performance test result. The client machine was Windows 8 IE 10 with 4 cores. I was using Microsoft Cooperation Network. The web site was hosted on Windows Azure China North data center (in Beijing) with one small web role (1.7GB 1 core CPU, 1.75GB memory with 100Mbps bandwidth). 
The test cases were - Chunk size: 512KB, 1MB, 2MB, 4MB. - Upload Mode: Sequence, parallel (unlimited), parallel with limit (4 threads, 8 threads). - Chunk Format: base64 string, binaries. - Target file: 100MB. - Each case was tested 3 times. Below is the test result chart. Some thoughts, but not guidance or best practice: - Parallel gets better performance than series. - No significant performance improvement between parallel 4 threads and 8 threads. - Transform with binaries provides better performance than base64. - In all cases, chunk size in 1MB - 2MB gets better performance.   Hope this helps, Shaun All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


  • New features of C# 4.0

    This article covers New features of C# 4.0. Article has been divided into below sections. Introduction. Dynamic Lookup. Named and Optional Arguments. Features for COM interop. Variance. Relationship with Visual Basic. Resources. Other interested readings… 22 New Features of Visual Studio 2008 for .NET Professionals 50 New Features of SQL Server 2008 IIS 7.0 New features Introduction It is now close to a year since Microsoft Visual C# 3.0 shipped as part of Visual Studio 2008. In the VS Managed Languages team we are hard at work on creating the next version of the language (with the unsurprising working title of C# 4.0), and this document is a first public description of the planned language features as we currently see them. Please be advised that all this is in early stages of production and is subject to change. Part of the reason for sharing our plans in public so early is precisely to get the kind of feedback that will cause us to improve the final product before it rolls out. Simultaneously with the publication of this whitepaper, a first public CTP (community technology preview) of Visual Studio 2010 is going out as a Virtual PC image for everyone to try. Please use it to play and experiment with the features, and let us know of any thoughts you have. We ask for your understanding and patience working with very early bits, where especially new or newly implemented features do not have the quality or stability of a final product. The aim of the CTP is not to give you a productive work environment but to give you the best possible impression of what we are working on for the next release. The CTP contains a number of walkthroughs, some of which highlight the new language features of C# 4.0. Those are excellent for getting a hands-on guided tour through the details of some common scenarios for the features. You may consider this whitepaper a companion document to these walkthroughs, complementing them with a focus on the overall language features and how they work, as opposed to the specifics of the concrete scenarios. C# 4.0 The major theme for C# 4.0 is dynamic programming. Increasingly, objects are “dynamic” in the sense that their structure and behavior is not captured by a static type, or at least not one that the compiler knows about when compiling your program. Some examples include a. objects from dynamic programming languages, such as Python or Ruby b. COM objects accessed through IDispatch c. ordinary .NET types accessed through reflection d. objects with changing structure, such as HTML DOM objects While C# remains a statically typed language, we aim to vastly improve the interaction with such objects. A secondary theme is co-evolution with Visual Basic. Going forward we will aim to maintain the individual character of each language, but at the same time important new features should be introduced in both languages at the same time. They should be differentiated more by style and feel than by feature set. The new features in C# 4.0 fall into four groups: Dynamic lookup Dynamic lookup allows you to write method, operator and indexer calls, property and field accesses, and even object invocations which bypass the C# static type checking and instead gets resolved at runtime. Named and optional parameters Parameters in C# can now be specified as optional by providing a default value for them in a member declaration. When the member is invoked, optional arguments can be omitted. Furthermore, any argument can be passed by parameter name instead of position. 
COM specific interop features Dynamic lookup as well as named and optional parameters both help making programming against COM less painful than today. On top of that, however, we are adding a number of other small features that further improve the interop experience. Variance It used to be that an IEnumerable<string> wasn’t an IEnumerable<object>. Now it is – C# embraces type safe “co-and contravariance” and common BCL types are updated to take advantage of that. Dynamic Lookup Dynamic lookup allows you a unified approach to invoking things dynamically. With dynamic lookup, when you have an object in your hand you do not need to worry about whether it comes from COM, IronPython, the HTML DOM or reflection; you just apply operations to it and leave it to the runtime to figure out what exactly those operations mean for that particular object. This affords you enormous flexibility, and can greatly simplify your code, but it does come with a significant drawback: Static typing is not maintained for these operations. A dynamic object is assumed at compile time to support any operation, and only at runtime will you get an error if it wasn’t so. Oftentimes this will be no loss, because the object wouldn’t have a static type anyway, in other cases it is a tradeoff between brevity and safety. In order to facilitate this tradeoff, it is a design goal of C# to allow you to opt in or opt out of dynamic behavior on every single call. The dynamic type C# 4.0 introduces a new static type called dynamic. When you have an object of type dynamic you can “do things to it” that are resolved only at runtime: dynamic d = GetDynamicObject(…); d.M(7); The C# compiler allows you to call a method with any name and any arguments on d because it is of type dynamic. At runtime the actual object that d refers to will be examined to determine what it means to “call M with an int” on it. The type dynamic can be thought of as a special version of the type object, which signals that the object can be used dynamically. It is easy to opt in or out of dynamic behavior: any object can be implicitly converted to dynamic, “suspending belief” until runtime. Conversely, there is an “assignment conversion” from dynamic to any other type, which allows implicit conversion in assignment-like constructs: dynamic d = 7; // implicit conversion int i = d; // assignment conversion Dynamic operations Not only method calls, but also field and property accesses, indexer and operator calls and even delegate invocations can be dispatched dynamically: dynamic d = GetDynamicObject(…); d.M(7); // calling methods d.f = d.P; // getting and settings fields and properties d[“one”] = d[“two”]; // getting and setting thorugh indexers int i = d + 3; // calling operators string s = d(5,7); // invoking as a delegate The role of the C# compiler here is simply to package up the necessary information about “what is being done to d”, so that the runtime can pick it up and determine what the exact meaning of it is given an actual object d. Think of it as deferring part of the compiler’s job to runtime. The result of any dynamic operation is itself of type dynamic. Runtime lookup At runtime a dynamic operation is dispatched according to the nature of its target object d: COM objects If d is a COM object, the operation is dispatched dynamically through COM IDispatch. This allows calling to COM types that don’t have a Primary Interop Assembly (PIA), and relying on COM features that don’t have a counterpart in C#, such as indexed properties and default properties. 
Dynamic objects If d implements the interface IDynamicObject d itself is asked to perform the operation. Thus by implementing IDynamicObject a type can completely redefine the meaning of dynamic operations. This is used intensively by dynamic languages such as IronPython and IronRuby to implement their own dynamic object models. It will also be used by APIs, e.g. by the HTML DOM to allow direct access to the object’s properties using property syntax. Plain objects Otherwise d is a standard .NET object, and the operation will be dispatched using reflection on its type and a C# “runtime binder” which implements C#’s lookup and overload resolution semantics at runtime. This is essentially a part of the C# compiler running as a runtime component to “finish the work” on dynamic operations that was deferred by the static compiler. Example Assume the following code: dynamic d1 = new Foo(); dynamic d2 = new Bar(); string s; d1.M(s, d2, 3, null); Because the receiver of the call to M is dynamic, the C# compiler does not try to resolve the meaning of the call. Instead it stashes away information for the runtime about the call. This information (often referred to as the “payload”) is essentially equivalent to: “Perform an instance method call of M with the following arguments: 1. a string 2. a dynamic 3. a literal int 3 4. a literal object null” At runtime, assume that the actual type Foo of d1 is not a COM type and does not implement IDynamicObject. In this case the C# runtime binder picks up to finish the overload resolution job based on runtime type information, proceeding as follows: 1. Reflection is used to obtain the actual runtime types of the two objects, d1 and d2, that did not have a static type (or rather had the static type dynamic). The result is Foo for d1 and Bar for d2. 2. Method lookup and overload resolution is performed on the type Foo with the call M(string,Bar,3,null) using ordinary C# semantics. 3. If the method is found it is invoked; otherwise a runtime exception is thrown. Overload resolution with dynamic arguments Even if the receiver of a method call is of a static type, overload resolution can still happen at runtime. This can happen if one or more of the arguments have the type dynamic: Foo foo = new Foo(); dynamic d = new Bar(); var result = foo.M(d); The C# runtime binder will choose between the statically known overloads of M on Foo, based on the runtime type of d, namely Bar. The result is again of type dynamic. The Dynamic Language Runtime An important component in the underlying implementation of dynamic lookup is the Dynamic Language Runtime (DLR), which is a new API in .NET 4.0. The DLR provides most of the infrastructure behind not only C# dynamic lookup but also the implementation of several dynamic programming languages on .NET, such as IronPython and IronRuby. Through this common infrastructure a high degree of interoperability is ensured, but just as importantly the DLR provides excellent caching mechanisms which serve to greatly enhance the efficiency of runtime dispatch. To the user of dynamic lookup in C#, the DLR is invisible except for the improved efficiency. However, if you want to implement your own dynamically dispatched objects, the IDynamicObject interface allows you to interoperate with the DLR and plug in your own behavior. This is a rather advanced task, which requires you to understand a good deal more about the inner workings of the DLR. 
For API writers, however, it can definitely be worth the trouble in order to vastly improve the usability of e.g. a library representing an inherently dynamic domain. Open issues There are a few limitations and things that might work differently than you would expect. · The DLR allows objects to be created from objects that represent classes. However, the current implementation of C# doesn’t have syntax to support this. · Dynamic lookup will not be able to find extension methods. Whether extension methods apply or not depends on the static context of the call (i.e. which using clauses occur), and this context information is not currently kept as part of the payload. · Anonymous functions (i.e. lambda expressions) cannot appear as arguments to a dynamic method call. The compiler cannot bind (i.e. “understand”) an anonymous function without knowing what type it is converted to. One consequence of these limitations is that you cannot easily use LINQ queries over dynamic objects: dynamic collection = …; var result = collection.Select(e => e + 5); If the Select method is an extension method, dynamic lookup will not find it. Even if it is an instance method, the above does not compile, because a lambda expression cannot be passed as an argument to a dynamic operation. There are no plans to address these limitations in C# 4.0. Named and Optional Arguments Named and optional parameters are really two distinct features, but are often useful together. Optional parameters allow you to omit arguments to member invocations, whereas named arguments is a way to provide an argument using the name of the corresponding parameter instead of relying on its position in the parameter list. Some APIs, most notably COM interfaces such as the Office automation APIs, are written specifically with named and optional parameters in mind. Up until now it has been very painful to call into these APIs from C#, with sometimes as many as thirty arguments having to be explicitly passed, most of which have reasonable default values and could be omitted. Even in APIs for .NET however you sometimes find yourself compelled to write many overloads of a method with different combinations of parameters, in order to provide maximum usability to the callers. Optional parameters are a useful alternative for these situations. Optional parameters A parameter is declared optional simply by providing a default value for it: public void M(int x, int y = 5, int z = 7); Here y and z are optional parameters and can be omitted in calls: M(1, 2, 3); // ordinary call of M M(1, 2); // omitting z – equivalent to M(1, 2, 7) M(1); // omitting both y and z – equivalent to M(1, 5, 7) Named and optional arguments C# 4.0 does not permit you to omit arguments between commas as in M(1,,3). This could lead to highly unreadable comma-counting code. Instead any argument can be passed by name. Thus if you want to omit only y from a call of M you can write: M(1, z: 3); // passing z by name or M(x: 1, z: 3); // passing both x and z by name or even M(z: 3, x: 1); // reversing the order of arguments All forms are equivalent, except that arguments are always evaluated in the order they appear, so in the last example the 3 is evaluated before the 1. Optional and named arguments can be used not only with methods but also with indexers and constructors. 
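To make the rules above concrete, here is a minimal, self-contained sketch — my own illustration rather than code from the whitepaper, with made-up class and method names — that should compile with the C# 4.0 compiler or later and shows optional parameters and named arguments side by side:

using System;

class OptionalArgsDemo
{
    // y and z are optional because they declare default values.
    static void M(int x, int y = 5, int z = 7)
    {
        Console.WriteLine("x={0}, y={1}, z={2}", x, y, z);
    }

    static void Main()
    {
        M(1, 2, 3);    // ordinary call:          x=1, y=2, z=3
        M(1, 2);       // omitting z:             x=1, y=2, z=7
        M(1);          // omitting y and z:       x=1, y=5, z=7
        M(1, z: 3);    // skipping y by naming z: x=1, y=5, z=3
        M(z: 3, x: 1); // named arguments in any order
    }
}

Running it prints the filled-in values for every omitted argument, which makes it easy to see how the compiler substitutes the declared defaults.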
Overload resolution
Named and optional arguments affect overload resolution, but the changes are relatively simple:

A signature is applicable if all its parameters are either optional or have exactly one corresponding argument (by name or position) in the call which is convertible to the parameter type. Betterness rules on conversions are only applied for arguments that are explicitly given – omitted optional arguments are ignored for betterness purposes. If two signatures are equally good, one that does not omit optional parameters is preferred.

M(string s, int i = 1);
M(object o);
M(int i, string s = "Hello");
M(int i);

M(5);

Given these overloads, we can see the working of the rules above. M(string,int) is not applicable because 5 doesn’t convert to string. M(int,string) is applicable because its second parameter is optional, and so, obviously, are M(object) and M(int). M(int,string) and M(int) are both better than M(object) because the conversion from 5 to int is better than the conversion from 5 to object. Finally M(int) is better than M(int,string) because no optional arguments are omitted. Thus the method that gets called is M(int).

Features for COM interop
Dynamic lookup as well as named and optional parameters greatly improve the experience of interoperating with COM APIs such as the Office Automation APIs. In order to remove even more of the speed bumps, a couple of small COM-specific features are also added to C# 4.0.

Dynamic import
Many COM methods accept and return variant types, which are represented in the PIAs as object. In the vast majority of cases, a programmer calling these methods already knows the static type of a returned object from context, but explicitly has to perform a cast on the returned value to make use of that knowledge. These casts are so common that they constitute a major nuisance.

In order to facilitate a smoother experience, you can now choose to import these COM APIs in such a way that variants are instead represented using the type dynamic. In other words, from your point of view, COM signatures now have occurrences of dynamic instead of object in them. This means that you can easily access members directly off a returned object, or you can assign it to a strongly typed local variable without having to cast. To illustrate, you can now say

excel.Cells[1, 1].Value = "Hello";

instead of

((Excel.Range)excel.Cells[1, 1]).Value2 = "Hello";

and

Excel.Range range = excel.Cells[1, 1];

instead of

Excel.Range range = (Excel.Range)excel.Cells[1, 1];

Compiling without PIAs
Primary Interop Assemblies are large .NET assemblies generated from COM interfaces to facilitate strongly typed interoperability. They provide great support at design time, where your experience of the interop is as good as if the types were really defined in .NET. However, at runtime these large assemblies can easily bloat your program, and also cause versioning issues because they are distributed independently of your application.

The no-PIA feature allows you to continue to use PIAs at design time without having them around at runtime. Instead, the C# compiler will bake the small part of the PIA that a program actually uses directly into its assembly. At runtime the PIA does not have to be loaded.

Omitting ref
Because of a different programming model, many COM APIs contain a lot of reference parameters. Contrary to refs in C#, these are typically not meant to mutate a passed-in argument for the subsequent benefit of the caller, but are simply another way of passing value parameters.
It therefore seems unreasonable that a C# programmer should have to create temporary variables for all such ref parameters and pass these by reference. Instead, specifically for COM methods, the C# compiler will allow you to pass arguments by value to such a method, and will automatically generate temporary variables to hold the passed-in values, subsequently discarding these when the call returns. In this way the caller sees value semantics, and will not experience any side effects, but the called method still gets a reference.

Open issues
A few COM interface features still are not surfaced in C#. Most notably these include indexed properties and default properties. As mentioned above these will be respected if you access COM dynamically, but statically typed C# code will still not recognize them. There are currently no plans to address these remaining speed bumps in C# 4.0.

Variance
An aspect of generics that often comes across as surprising is that the following is illegal:

IList<string> strings = new List<string>();
IList<object> objects = strings;

The second assignment is disallowed because strings does not have the same element type as objects. There is a perfectly good reason for this. If it were allowed you could write:

objects[0] = 5;
string s = strings[0];

This would allow an int to be inserted into a list of strings and subsequently extracted as a string, which would be a breach of type safety.

However, there are certain interfaces where the above cannot occur, notably where there is no way to insert an object into the collection. Such an interface is IEnumerable<T>. If instead you say:

IEnumerable<object> objects = strings;

there is no way we can put the wrong kind of thing into strings through objects, because objects doesn’t have a method that takes an element in. Variance is about allowing assignments such as this in cases where it is safe. The result is that a lot of situations that were previously surprising now just work.

Covariance
In .NET 4.0 the IEnumerable<T> interface will be declared in the following way:

public interface IEnumerable<out T> : IEnumerable
{
    IEnumerator<T> GetEnumerator();
}

public interface IEnumerator<out T> : IEnumerator
{
    bool MoveNext();
    T Current { get; }
}

The “out” in these declarations signifies that the T can only occur in output position in the interface – the compiler will complain otherwise. In return for this restriction, the interface becomes “covariant” in T, which means that an IEnumerable<A> is considered an IEnumerable<B> if A has a reference conversion to B. As a result, any sequence of strings is also e.g. a sequence of objects. This is useful e.g. in many LINQ methods. Using the declarations above:

var result = strings.Union(objects); // succeeds with an IEnumerable<object>

This would previously have been disallowed, and you would have had to do some cumbersome wrapping to get the two sequences to have the same element type.

Contravariance
Type parameters can also have an “in” modifier, restricting them to occur only in input positions. An example is IComparer<T>:

public interface IComparer<in T>
{
    int Compare(T left, T right);
}

The somewhat baffling result is that an IComparer<object> can in fact be considered an IComparer<string>! It makes sense when you think about it: If a comparer can compare any two objects, it can certainly also compare two strings. This property is referred to as contravariance.
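The following small program exercises both directions at once. It relies on the variant declarations of IEnumerable<out T> and IComparer<in T> that ship with the final .NET 4.0 libraries (the Limitations note below explains that the CTP build does not yet include these retrofitted types, in which case you would declare your own variant interface instead); the AnyObjectComparer type is invented for the sketch.

using System;
using System.Collections.Generic;

// A comparer that can order any two objects by their string representations.
class AnyObjectComparer : IComparer<object>
{
    public int Compare(object left, object right)
    {
        return string.Compare(left.ToString(), right.ToString(),
                              StringComparison.OrdinalIgnoreCase);
    }
}

class Program
{
    static void Main()
    {
        IEnumerable<string> strings = new List<string> { "banana", "Apple", "cherry" };

        // Covariance: a sequence of strings may be treated as a sequence of objects.
        IEnumerable<object> objects = strings;

        // Contravariance: a comparer of objects may be used where a comparer
        // of strings is expected, since it can certainly compare two strings.
        IComparer<string> byText = new AnyObjectComparer();

        var sorted = new List<string>(strings);
        sorted.Sort(byText);

        Console.WriteLine(string.Join(", ", sorted));   // Apple, banana, cherry
        foreach (object o in objects) Console.WriteLine(o);
    }
}

Both of the marked assignments compile only because of the out and in annotations; against the non-variant .NET 3.5 declarations each one would be a compile-time error.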
A generic type can have both in and out modifiers on its type parameters, as is the case with the Func<…> delegate types:

public delegate TResult Func<in TArg, out TResult>(TArg arg);

Obviously the argument only ever comes in, and the result only ever comes out. Therefore a Func<object,string> can in fact be used as a Func<string,object>.

Limitations
Variant type parameters can only be declared on interfaces and delegate types, due to a restriction in the CLR. Variance only applies when there is a reference conversion between the type arguments. For instance, an IEnumerable<int> is not an IEnumerable<object> because the conversion from int to object is a boxing conversion, not a reference conversion. Also please note that the CTP does not contain the new versions of the .NET types mentioned above. In order to experiment with variance you have to declare your own variant interfaces and delegate types.

COM Example
Here is a larger Office automation example that shows many of the new C# features in action.

using System;
using System.Diagnostics;
using System.Linq;
using Excel = Microsoft.Office.Interop.Excel;
using Word = Microsoft.Office.Interop.Word;

class Program
{
    static void Main(string[] args)
    {
        var excel = new Excel.Application();
        excel.Visible = true;
        excel.Workbooks.Add();                       // optional arguments omitted
        excel.Cells[1, 1].Value = "Process Name";    // no casts; Value dynamically
        excel.Cells[1, 2].Value = "Memory Usage";    // accessed

        var processes = Process.GetProcesses()
            .OrderByDescending(p => p.WorkingSet)
            .Take(10);

        int i = 2;
        foreach (var p in processes)
        {
            excel.Cells[i, 1].Value = p.ProcessName; // no casts
            excel.Cells[i, 2].Value = p.WorkingSet;  // no casts
            i++;
        }

        Excel.Range range = excel.Cells[1, 1];       // no casts
        Excel.Chart chart = excel.ActiveWorkbook.Charts.
            Add(After: excel.ActiveSheet);           // named and optional arguments
        chart.ChartWizard(
            Source: range.CurrentRegion,
            Title: "Memory Usage in " + Environment.MachineName); //named+optional
        chart.ChartStyle = 45;
        chart.CopyPicture(Excel.XlPictureAppearance.xlScreen,
            Excel.XlCopyPictureFormat.xlBitmap,
            Excel.XlPictureAppearance.xlScreen);

        var word = new Word.Application();
        word.Visible = true;
        word.Documents.Add();                        // optional arguments
        word.Selection.Paste();
    }
}

The code is much more terse and readable than the C# 3.0 counterpart. Note especially how the Value property is accessed dynamically. This is actually an indexed property, i.e. a property that takes an argument; something which C# does not understand. However the argument is optional. Since the access is dynamic, it goes through the runtime COM binder which knows to substitute the default value and call the indexed property. Thus, dynamic COM allows you to avoid accesses to the puzzling Value2 property of Excel ranges.

Relationship with Visual Basic
A number of the features introduced to C# 4.0 already exist or will be introduced in some form or other in Visual Basic:
· Late binding in VB is similar in many ways to dynamic lookup in C#, and can be expected to make more use of the DLR in the future, leading to further parity with C#.
· Named and optional arguments have been part of Visual Basic for a long time, and the C# version of the feature is explicitly engineered with maximal VB interoperability in mind.
· NoPIA and variance are both being introduced to VB and C# at the same time.
VB in turn is adding a number of features that have hitherto been a mainstay of C#.
As a result future versions of C# and VB will have much better feature parity, for the benefit of everyone.

Resources
All available resources concerning C# 4.0 can be accessed through the C# Dev Center. Specifically, this white paper and other resources can be found at the Code Gallery site. Enjoy!

    Read the article

  • MacBook OS X applications crashing on startup

    - by Ryan Doom
    Closed my computer last night, went home. Opened it and it had restarted. Now when I open a couple programs such as Adobe Fireworks or Appcelerator Titanium they throw up a nasty error like below. Other programs (Chrome, Firefox, Textmate, Versions) work fine. Any thoughts on this? I haven't owned my MacBook long so I'm not even aware of the right tools or places to look to track this down. Any help would be most appreciated. It's making it hard to get my work done :] If it helps at all both those programs were probably open when it restarted. From the look of it I'm not sure if it's a permissions error or something? I completely re-installed one of the applications (Appcelerator Titanium). Didn't seem to help. Process: Adobe Fireworks CS5 [1044] Path: /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/MacOS/Adobe Fireworks CS5 Identifier: com.macromedia.fireworks Version: Adobe Fireworks CS5 version 11.0.0.484 (11.0.0) Code Type: X86 (Native) Parent Process: launchd [87] Date/Time: 2011-02-18 09:45:47.689 -0500 OS Version: Mac OS X 10.6.6 (10J567) Report Version: 6 Interval Since Last Report: 12983 sec Crashes Since Last Report: 6 Per-App Interval Since Last Report: 325365 sec Per-App Crashes Since Last Report: 4 Anonymous UUID: D16EAFE7-2F04-44D4-A984-5902A6EF8943 Exception Type: EXC_BAD_ACCESS (SIGBUS) Exception Codes: KERN_PROTECTION_FAILURE at 0x00000000b0327ff8 Crashed Thread: 7 Thread 0: Dispatch queue: com.apple.main-thread 0 libSystem.B.dylib 0x97dd0142 semaphore_wait_signal_trap + 10 1 libSystem.B.dylib 0x97dd5c46 pthread_mutex_lock + 490 2 libstdc++.6.dylib 0x91887559 __gnu_cxx::__recursive_mutex::lock() + 17 3 libstdc++.6.dylib 0x918874e6 __cxa_guard_acquire + 68 4 libTrueTypeScaler.dylib 0x91c92ab3 TTScalerInfo() + 50 5 libFontParser.dylib 0x9979a5f1 TTrueTypeScaler::CreateTrueTypeScaler() + 43 6 libSystem.B.dylib 0x97dee900 pthread_once + 82 7 libFontParser.dylib 0x9979a575 TTrueTypeScaler::GetTrueTypeScaler() + 47 8 libFontParser.dylib 0x9979a520 TTrueTypeScaler::TTrueTypeScaler(TScalerStrike const&) + 26 9 libFontParser.dylib 0x9979a4be TFontScaler::CreateFontScaler(TScalerStrike const&) + 52 10 libFontParser.dylib 0x9979bd93 FPFontGetGlyphsForUnichars + 344 11 com.apple.CoreText 0x98255cfe TBaseFont::CalculateFontMetrics(bool) const + 342 12 com.apple.CoreText 0x98255b55 TBaseFont::InitFontMetrics() const + 51 13 com.apple.CoreText 0x98255959 TBaseFont::GetStrikeMetrics(float, CGAffineTransform const*, bool) const + 81 14 com.apple.CoreText 0x982558cd TFont::InitStrikeMetrics() const + 55 15 com.apple.CoreText 0x982592cf CTFontGetAscent + 49 16 com.apple.AppKit 0x989f5d08 __NSFontInstanceInfoInitializeMetricsInfo + 48 17 com.apple.AppKit 0x989f5cbc -[__NSSharedFontInstanceInfo _defaultLineHeight:] + 40 18 com.apple.AppKit 0x98f3c5e8 +[NSStringDrawingTextStorage _fastDrawString:attributes:length:inRect:graphicsContext:baselineRendering:usesFontLeading:usesScreenFont:typesetterBehavior:paragraphStyle:lineBreakMode:boundingRect:padding:scrollable:] + 2041 19 com.apple.AppKit 0x98abd2d9 _NSStringDrawingCore + 1555 20 com.apple.AppKit 0x98abca8b _NSDrawTextCell + 3465 21 com.apple.AppKit 0x98ac6185 -[NSTextFieldCell drawInteriorWithFrame:inView:] + 764 22 com.apple.AppKit 0x98ac5d26 -[NSTextFieldCell drawWithFrame:inView:] + 816 23 com.apple.AppKit 0x98ac03de -[NSControl drawRect:] + 589 24 com.apple.AppKit 0x98ab882a -[NSView _drawRect:clip:] + 3510 25 com.apple.AppKit 0x98ab74c8 -[NSView _recursiveDisplayAllDirtyWithLockFocus:visRect:] + 1600 26 
com.apple.AppKit 0x98ab77fd -[NSView _recursiveDisplayAllDirtyWithLockFocus:visRect:] + 2421 27 com.apple.AppKit 0x98ab77fd -[NSView _recursiveDisplayAllDirtyWithLockFocus:visRect:] + 2421 28 com.apple.AppKit 0x98ab59e7 -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectForView:topView:] + 711 29 com.apple.AppKit 0x98b54aa3 -[NSNextStepFrame _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectForView:topView:] + 311 30 com.apple.AppKit 0x98ab1ea2 -[NSView _displayRectIgnoringOpacity:isVisibleRect:rectIsVisibleRectForView:] + 3309 31 com.apple.AppKit 0x98a12a57 -[NSView displayIfNeeded] + 818 32 com.apple.AppKit 0x989c6661 -[NSNextStepFrame displayIfNeeded] + 98 33 com.apple.AppKit 0x98b55390 -[NSWindow display] + 75 34 com.macromedia.fireworks 0x00bade98 0x1000 + 12242584 35 com.macromedia.fireworks 0x0089f778 0x1000 + 9037688 36 libPowerPlant2.dylib 0x08109722 FW_PowerPlant::LCarbonApp::Run() + 54 37 com.macromedia.fireworks 0x008a138c 0x1000 + 9044876 38 com.macromedia.fireworks 0x00003596 0x1000 + 9622 Thread 1: Dispatch queue: com.apple.libdispatch-manager 0 libSystem.B.dylib 0x97df6982 kevent + 10 1 libSystem.B.dylib 0x97df709c _dispatch_mgr_invoke + 215 2 libSystem.B.dylib 0x97df6559 _dispatch_queue_invoke + 163 3 libSystem.B.dylib 0x97df62fe _dispatch_worker_thread2 + 240 4 libSystem.B.dylib 0x97df5d81 _pthread_wqthread + 390 5 libSystem.B.dylib 0x97df5bc6 start_wqthread + 30 Thread 2: 0 libSystem.B.dylib 0x97df5a12 __workq_kernreturn + 10 1 libSystem.B.dylib 0x97df5fa8 _pthread_wqthread + 941 2 libSystem.B.dylib 0x97df5bc6 start_wqthread + 30 Thread 3: 0 libSystem.B.dylib 0x97dd015a semaphore_timedwait_signal_trap + 10 1 libSystem.B.dylib 0x97dfdce5 _pthread_cond_wait + 1066 2 libSystem.B.dylib 0x97e2cac8 pthread_cond_timedwait_relative_np + 47 3 ...ple.CoreServices.CarbonCore 0x97af4ecd TSWaitOnConditionTimedRelative + 242 4 ...ple.CoreServices.CarbonCore 0x97af4c0b TSWaitOnSemaphoreCommon + 511 5 ...ple.CoreServices.CarbonCore 0x97b18e33 TimerThread + 97 6 libSystem.B.dylib 0x97dfd85d _pthread_start + 345 7 libSystem.B.dylib 0x97dfd6e2 thread_start + 34 Thread 4: 0 libSystem.B.dylib 0x97dd00fa mach_msg_trap + 10 1 libSystem.B.dylib 0x97dd0867 mach_msg + 68 2 ...ple.CoreServices.CarbonCore 0x97b9e0d0 YieldToThread + 446 3 ...ple.CoreServices.CarbonCore 0x97b9e1d3 SetThreadState + 134 4 ...ple.CoreServices.CarbonCore 0x97b9e28e SetThreadStateEndCritical + 111 5 libPowerPlant2.dylib 0x0811ab51 FW_PowerPlant::LThread::SemWait(FW_PowerPlant::LSemaphore*, long, QHdr&, unsigned char&) + 119 6 libPowerPlant2.dylib 0x08119b07 FW_PowerPlant::LSemaphore::BlockThread(long) + 61 7 libPowerPlant2.dylib 0x08119b6d FW_PowerPlant::LSemaphore::Wait(long) + 71 8 libPowerPlant2.dylib 0x0811af70 FW_PowerPlant::LThread::Cleanup::Run() + 32 9 libPowerPlant2.dylib 0x0811b94e FW_PowerPlant::LThread::DoEntry(void*) + 30 10 ...ple.CoreServices.CarbonCore 0x97b9e85f CooperativeThread + 309 11 libSystem.B.dylib 0x97dfd85d _pthread_start + 345 12 libSystem.B.dylib 0x97dfd6e2 thread_start + 34 Thread 5: 0 libSystem.B.dylib 0x97dd0142 semaphore_wait_signal_trap + 10 1 libSystem.B.dylib 0x97dfdcfc _pthread_cond_wait + 1089 2 libSystem.B.dylib 0x97e4646f pthread_cond_wait + 48 3 com.adobe.amt.services 0x1dd73126 AMTConditionLock::LockWhenCondition(int) + 46 4 com.adobe.amt.services 0x1dd6bdb0 _AMTThreadedPCDService::PCDThreadWorker(_AMTThreadedPCDService*) + 116 5 com.adobe.amt.services 0x1dd7318c AMTThread::Worker(void*) + 24 6 libSystem.B.dylib 
0x97dfd85d _pthread_start + 345 7 libSystem.B.dylib 0x97dfd6e2 thread_start + 34 Thread 6: 0 libSystem.B.dylib 0x97dfe0a6 __semwait_signal + 10 1 libSystem.B.dylib 0x97dfdd62 _pthread_cond_wait + 1191 2 libSystem.B.dylib 0x97dff9f8 pthread_cond_wait$UNIX2003 + 73 3 ...ple.CoreServices.CarbonCore 0x97b0951e TSWaitOnCondition + 126 4 ...ple.CoreServices.CarbonCore 0x97af4ea5 TSWaitOnConditionTimedRelative + 202 5 ...ple.CoreServices.CarbonCore 0x97af0873 MPWaitOnQueue + 250 6 com.macromedia.fireworks 0x00ae43cf 0x1000 + 11416527 7 ...ple.CoreServices.CarbonCore 0x97ad485a PrivateMPEntryPoint + 68 8 libSystem.B.dylib 0x97dfd85d _pthread_start + 345 9 libSystem.B.dylib 0x97dfd6e2 thread_start + 34 Thread 7 Crashed: 0 libstdc++.6.dylib 0x9184e00c std::basic_ofstream<char, std::char_traits<char> >::open(char const*, std::_Ios_Openmode) + 16 1 libstdc++.6.dylib 0x9184fe9b std::basic_ifstream<char, std::char_traits<char> >::basic_ifstream(char const*, std::_Ios_Openmode) + 211 2 ...pdaterNotificationFramework 0x1e824779 ESDifstream::ESDifstream(std::string const&, char const*, std::_Ios_Openmode) + 73 3 ...pdaterNotificationFramework 0x1e821b6a esd::ExpatDOMBuilder<esd::XMLDocumentNode>::ParseFile(std::string const&, bool) + 96 4 ...pdaterNotificationFramework 0x1e822da4 esd::PrefsWriter::SetPrefsPath(std::string const&) + 206 5 ...pdaterNotificationFramework 0x1e8449b3 AdobeUpdaterPrefs::AdobeUpdaterPrefs() + 8609 6 ...pdaterNotificationFramework 0x1e8459f4 AdobeUpdaterPrefs::GetAdobeUpdaterPrefs() + 68 7 ...pdaterNotificationFramework 0x1e820728 UpdaterNotificationsImpl::InitLogFile() + 48 8 ...pdaterNotificationFramework 0x1e820d49 UpdaterNotificationsImpl::Instance() + 53 9 ...pdaterNotificationFramework 0x1e823638 UpdaterNotificationsIsUpdaterEnabled + 22 10 com.adobe.amt.services 0x1dd69d15 _AMTAUMService::IsUpdaterEnabled(T_CSUStatusMajor*, int*) + 359 11 com.adobe.amtlib 0x01f5501c AMTAUMServiceIsUpdaterEnabled + 290 12 com.adobe.amtlib 0x01f1f789 AMTImpl::CallMenuEnablers() + 71 13 com.adobe.amtlib 0x01f260fa AMTImpl::DoLaunchWorkflow(AMTImpl::LaunchSequence) + 1664 14 com.adobe.amtlib 0x01f26a5d AMTImpl::DoValidateWorkflow(AMTImpl::LaunchSequence) + 293 15 com.adobe.amtlib 0x01f26cf5 AMTImpl::DoPreValidateWorkflow() + 119 16 com.adobe.amtlib 0x01f26e71 AMTImpl::ServiceLoaderThread(void*) + 45 17 com.adobe.amtlib 0x01f54c48 AMTThread::Worker(void*) + 24 18 libSystem.B.dylib 0x97dfd85d _pthread_start + 345 19 libSystem.B.dylib 0x97dfd6e2 thread_start + 34 Thread 7 crashed with X86 Thread State (32-bit): eax: 0x00000016 ebx: 0x098c9a00 ecx: 0xa013dfc0 edx: 0x00000003 edi: 0x098c9a08 esi: 0x098c9c0c ebp: 0xb03a7448 esp: 0xb0327ff0 ss: 0x0000001f efl: 0x00010202 eip: 0x9184e00c cs: 0x00000017 ds: 0x0000001f es: 0x0000001f fs: 0x0000001f gs: 0x00000037 cr2: 0xb0327ff8 Binary Images: 0x1000 - 0x1448ff1 +com.macromedia.fireworks Adobe Fireworks CS5 version 11.0.0.484 (11.0.0) <38213EBD-FDB0-FC20-40E8-87935A5386BB> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/MacOS/Adobe Fireworks CS5 0x1e76000 - 0x1ec9ffb +com.adobe.headlights.LogSessionFramework ??? 
(2.0.1.011) <4F2BFF03-01D2-A07D-E5E2-7F88D4C2DEC4> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/LogSession.framework/Versions/A/LogSession 0x1f11000 - 0x1f77ffb +com.adobe.amtlib amtlib 3.0.0.64 (3.0.0.64) <DD471011-9120-1BC2-F1B5-D6FF09D0859F> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/amtlib.framework/Versions/A/amtlib 0x1fa7000 - 0x2146fe7 +com.adobe.owl AdobeOwl version 3.0.81 (3.0.81) <9C261D9E-9BD7-5DE6-5676-AEEF4828D17B> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeOwl.framework/Versions/A/AdobeOwl 0x21af000 - 0x22e7fe7 +WRServices ??? (???) <52CE5B97-1E6A-92A2-EA70-93511AB7EA2E> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/WRServices.framework/Versions/A/WRServices 0x232d000 - 0x239afef +FileInfo ??? (???) <4A4C74F9-CA83-B174-F56D-F7671DC61389> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/FileInfo.framework/Versions/A/FileInfo 0x23b5000 - 0x23dbff6 +AdobeAXE8SharedExpat ??? (???) <5848BBCE-3A3E-66EE-5527-97A96F0CA4CC> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeAXE8SharedExpat.framework/Versions/A/AdobeAXE8SharedExpat 0x23ec000 - 0x2407fff +AdobeBIB ??? (???) <3B3092DC-A296-9D1C-1922-D20E6A5A7D7E> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeBIB.framework/Versions/A/AdobeBIB 0x2411000 - 0x2469ff7 +AdobeXMP ??? (???) <73329999-C364-2451-6574-4D0277057D19> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeXMP.framework/Versions/A/AdobeXMP 0x2478000 - 0x2aa6fe7 +AdobeAGM ??? (???) <91D37E54-E985-47E1-2696-0BD7E4183132> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeAGM.framework/Versions/A/AdobeAGM 0x2c04000 - 0x2d18fff +AdobeACE ??? (???) <DD291A17-ECF4-FE20-5837-AC1F5BC76940> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeACE.framework/Versions/A/AdobeACE 0x2d3b000 - 0x302dff7 +AdobeCoolType ??? (???) <9FDD596D-9824-2BB9-5DA2-25DACAB6A324> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeCoolType.framework/Versions/A/AdobeCoolType 0x30b5000 - 0x30d6ff7 +AdobeBIBUtils ??? (???) <E1FAA7A3-E807-DE5A-1F68-7A53780E8202> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeBIBUtils.framework/Versions/A/AdobeBIBUtils 0x30e2000 - 0x311efff +AdobeARE ??? (???) <76851E91-2381-5D05-742C-BB24E4BAD276> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeARE.framework/Versions/A/AdobeARE 0x3127000 - 0x34ffff7 +AdobeMPS ??? (???) <13614867-4D80-EB74-FA7F-6136492478BA> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeMPS.framework/Versions/A/AdobeMPS 0x362e000 - 0x3c62feb +AdobePDFL ??? (???) 
<49D6D58A-1EBB-424A-4CB0-8F9691E0991D> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobePDFL.framework/Versions/A/AdobePDFL 0x3d8e000 - 0x4ad1fff +com.adobe.psl AdobePSL 12.0.0.7524 (12.0.0.7524) <CFBCB19A-03F7-D095-1F48-8D68F05A25C5> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobePSL.framework/Versions/A/AdobePSL 0x4e10000 - 0x4e9aff7 +com.adobe.AdobeScCore ScCore 4.1.7 (4.1.7.5522) <053A109E-3E3E-D3EE-7186-4920D927D2AD> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeScCore.framework/Versions/A/AdobeScCore 0x4edd000 - 0x4fc0fef +AdobePDFPort ??? (???) <A2E6DCF7-283F-09E9-53AE-D5D84D020469> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobePDFPort.framework/Versions/A/AdobePDFPort 0x4ff5000 - 0x4ff8ff8 +com.adobe.ape.shim adbeape version 3.1.65.7508 (3.1.65.7508) <FFDDAB7A-220F-7344-F12B-010CA0C41DAB> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/adbeape.framework/Versions/A/adbeape 0x4ffe000 - 0x508fff7 +libicucnv.dylib.36.0 36.0.0 (compatibility 36.0.0) <581475CC-C039-1B42-49BA-71811D8B4E15> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ICUConverter.framework/Versions/3.6/libicucnv.dylib.36.0 0x50ae000 - 0x5a5efff +libicudata.dylib.36.0 36.0.0 (compatibility 36.0.0) <02108DEA-3DD2-14BE-DAEB-BE522B619C1D> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ICUData.framework/Versions/3.6/libicudata.dylib.36.0 0x5a61000 - 0x5b2eff3 +libicui18n.dylib.36.0 36.0.0 (compatibility 36.0.0) <08F15219-7F35-574E-7725-1ACAA1B18A00> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ICUInternationalization.framework/Versions/3.6/libicui18n.dylib.36.0 0x5b91000 - 0x5c6bfef +libicuuc.dylib.36.0 36.0.0 (compatibility 36.0.0) <5EE72009-40B3-7FB7-3A49-576AEDE0D400> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ICUUnicode.framework/Versions/3.6/libicuuc.dylib.36.0 0x5cab000 - 0x6a36fe7 +com.adobe.illustrator 382 (15.0.0) <64F68532-0311-6BBA-1F50-246CAF917549> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AILib.framework/Versions/A/AILib 0x781b000 - 0x785ffff +com.adobe.illustrator.aiport AIPort version 1.0 (1.0) <69EDC44E-D7BB-A259-282D-C42725AE0E26> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AIPort.framework/Versions/A/AIPort 0x78c2000 - 0x7908fff +FilterPort ??? (???) <23FAE9D1-9376-1E71-21F7-D3EB2BFD50EE> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/FilterPort.framework/Versions/A/FilterPort 0x797d000 - 0x797dfff +SPBasic ??? (???) <5D1760D8-C910-C641-0BC9-CF74A1A5190D> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/SPBasic.framework/Versions/A/SPBasic 0x7981000 - 0x7b67ff7 +com.adobe.linguistic.LinguisticManager 5.0.0 (11309) <CA1D50A3-F965-F8B2-76B9-007F290C5791> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeLinguistic.framework/Versions/3/AdobeLinguistic 0x7bf5000 - 0x7cc2fe7 +AdobeAXEDOMCore ??? (???) 
<F76D74DC-FD5A-9783-C447-2E58773DA7E1> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeAXEDOMCore.framework/Versions/A/AdobeAXEDOMCore 0x7d31000 - 0x7ea9ffb +com.adobe.PlugPlug 2.0.0.746 (2.0.0.746) <08AD22E3-34C0-6749-E497-616C66A246AD> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/PlugPlug.framework/Versions/A/PlugPlug 0x7f4d000 - 0x7f6afef +libCurl.dylib ??? (???) <1BA6E2DE-EF14-D50A-4697-035AE07875D7> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/MacOS/libCurl.dylib 0x7f72000 - 0x7f88ff4 +libChar16.dylib ??? (???) <19B0479C-72B1-EE14-6385-7F655DEC0F02> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/MacOS/libChar16.dylib 0x7f90000 - 0x7fb3fe0 +libCoreTypes.dylib ??? (???) <F5306147-FFBD-2826-D356-B26258DBFA09> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/MacOS/libCoreTypes.dylib 0x7fc3000 - 0x7fcaffc com.apple.carbonframeworktemplate 1.0 (1.0) <0D270CC7-B715-943E-2B4F-5C9B5775505A> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/NetIO.framework/Versions/A/NetIO 0x7fd6000 - 0x7fd9fff +Dioxide.dylib ??? (???) <BCE94F23-4CCA-20FB-79A8-DE7925879DCD> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/Dioxide.dylib 0x7fe1000 - 0x7fe7ffc +libfwutility.dylib ??? (???) <6A723D9E-A60B-56EE-2B8D-B91991793749> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/libfwutility.dylib 0x7fee000 - 0x803efff +com.macromedia.javascript Javascript version 1.0 (1.0) <540CB029-3946-8E41-BD91-AED6F73C86B7> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/Javascript.framework/Versions/A/Javascript 0x8053000 - 0x8060fff +com.macromedia.moa Moa version 1.0 (1.0) <3C4B7F42-5A5D-78E7-B1DC-DAA06A99CCB2> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/Moa.framework/Versions/A/Moa 0x8069000 - 0x8070fff +com.macromedia.morefiles MoreFiles version 1.0 (1.0) <36115C66-79A3-5DB9-B36B-8D655B46FC76> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/MoreFiles.framework/Versions/A/MoreFiles 0x8077000 - 0x815bfe3 +libPowerPlant2.dylib ??? (???) <964FB3D7-B7EE-94EB-FD95-4AE90C657A4A> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/libPowerPlant2.dylib 0x828e000 - 0x8294ffb +com.macromedia.testframework 1.0 (1.0) <ED14FA00-1C6F-D433-1EEB-833BB4402B2B> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/uwchar.framework/Versions/A/uwchar 0x8298000 - 0x829cffc +com.adobe.AdobeCrashReporter 3.0 (3.0.20100302) <E6437929-0E69-8A56-E69F-F64305E82DD9> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeCrashReporter.framework/Versions/A/AdobeCrashReporter 0x82a3000 - 0x82bbfef +libgiff.dylib ??? (???) <8F90552B-3D11-2B1E-D1BA-A109FEB99969> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/libgiff.dylib 0x82c3000 - 0x82e1fe7 +com.macromedia.png LibPNG version 1.0 (1.0) <2DBA0A3F-4F01-7474-0FED-3021382D635F> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/LibPNG.framework/Versions/A/LibPNG 0x82e9000 - 0x82f7feb +com.macromedia.zlib ZLib version 1.0 (1.0) <EEA4CFAF-A748-FA72-91F0-ADE7A1BE9FA7> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ZLib.framework/Versions/A/ZLib 0x82fc000 - 0x8300ffd +com.yourcompany.yourcocoaframework ??? 
(1.0) <7EF7A82E-0AAE-0022-3B15-7C50F1C550C1> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ASEFramework.framework/Versions/A/ASEFramework 0x8305000 - 0x830cff2 +com.adobe.boost_threads.framework boost_threads version 5.0.0 (5.0.0.0) <F966C78A-3CC1-8678-B3B7-B0A2B118343A> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/boost_threads.framework/Versions/A/boost_threads 0x831c000 - 0x8322fef +com.adobe.boost_date_time.framework boost_date_time version 5.0.0 (5.0.0.0) <8837A972-1EBE-CAA9-473A-CD157F17163D> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/boost_date_time.framework/Versions/A/boost_date_time 0x8333000 - 0x83b0fff +AdobeOwlCanvas ??? (???) <65B2E680-4F43-BE46-2290-3500758D1BF7> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/AdobeOwlCanvas.framework/Versions/A/AdobeOwlCanvas 0x83cc000 - 0x83d7ff3 +com.adobe.boost_filesystem.framework boost_filesystem version 5.0.0 (5.0.0.0) <90B8B4E3-6C44-D110-1545-1A34EB14B22D> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/boost_filesystem.framework/Versions/A/boost_filesystem 0x83eb000 - 0x83edffb +com.adobe.boost_system.framework boost_system version 5.0.0 (5.0.0.0) <0C4D56E8-9593-4C4A-4A7E-BEAEDE1CA131> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/boost_system.framework/Versions/A/boost_system ... E86745B94A4B> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontParser.dylib 0x9984a000 - 0x9989aff7 com.apple.framework.familycontrols 2.0.2 (2020) <AF7F86F1-F7BF-CBA8-7A4A-D8F7A19F9601> /System/Library/PrivateFrameworks/FamilyControls.framework/Versions/A/FamilyControls 0x99a6e000 - 0x99a6fff7 com.apple.audio.units.AudioUnit 1.6.5 (1.6.5) <BE4C2495-B758-AD22-DCC0-56A6791E948E> /System/Library/Frameworks/AudioUnit.framework/Versions/A/AudioUnit 0x99a72000 - 0x99a86ffb com.apple.speech.synthesis.framework 3.10.35 (3.10.35) <9F5CE4F7-D05C-8C14-4B76-E43D07A8A680> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/SpeechSynthesis.framework/Versions/A/SpeechSynthesis 0xb0000000 - 0xb000fff8 +com.adobe.ahclientframework 1.5.0.30 (1.5.0.30) <24B39C2F-79B0-BDE3-C6D0-1F0E943070C7> /Applications/Adobe Fireworks CS5/Adobe Fireworks CS5.app/Contents/Frameworks/ahclient.framework/Versions/A/ahclient 0xffff0000 - 0xffff1fff libSystem.B.dylib ??? (???) <62291026-D016-705D-DC1E-FC2B09D47DE5> /usr/lib/libSystem.B.dylib If you prefer, Here are the crashes on Pastebin: Crash 1 (Fireworks) Crash 2 (Appcelerator Titanium)

    Read the article

  • Help getting frame rate (fps) up in Python + Pygame

    - by Jordan Magnuson
    I am working on a little card-swapping world-travel game that I sort of envision as a cross between Bejeweled and the 10 Days geography board games. So far the coding has been going okay, but the frame rate is pretty bad... currently I'm getting low 20's on my Core 2 Duo. This is a problem since I'm creating the game for Intel's March developer competition, which is squarely aimed at netbooks packing underpowered Atom processors. Here's a screen from the game: www.necessarygames.com/my_games/betraveled/betraveled-fps.png I am very new to Python and Pygame (this is the first thing I've used them for), and am sadly lacking in formal CS training... which is to say that I think there are probably A LOT of bad practices going on in my code, and A LOT that could be optimized. If some of you older Python hands wouldn't mind taking a look at my code and seeing if you can't find any obvious areas for optimization, I would be extremely grateful. You can download the full source code here: http://www.necessarygames.com/my_games/betraveled/betraveled_src0328.zip Compiled exe here: www.necessarygames.com/my_games/betraveled/betraveled_src0328.zip One thing I am concerned about is my event manager, which I feel may have some performance holes in it, and another thing is my rendering... I'm pretty much just blitting everything to the screen all the time (see the render routines in my game_components.py below); I recently found out that you should only update the areas of the screen that have changed, but I'm still foggy on how that's accomplished exactly... could this be a huge performance issue? Any thoughts are much appreciated! As usual, I'm happy to "tip" you for your time and energy via PayPal. Jordan Here are some bits of the source: Main.py #Remote imports import pygame from pygame.locals import * #Local imports import config import rooms from event_manager import * from events import * class RoomController(object): """Controls which room is currently active (eg Title Screen)""" def __init__(self, screen, ev_manager): self.room = None self.screen = screen self.ev_manager = ev_manager self.ev_manager.register_listener(self) self.room = self.set_room(config.room) def set_room(self, room_const): #Unregister old room from ev_manager if self.room: self.room.ev_manager.unregister_listener(self.room) self.room = None #Set new room based on const if room_const == config.TITLE_SCREEN: return rooms.TitleScreen(self.screen, self.ev_manager) elif room_const == config.GAME_MODE_ROOM: return rooms.GameModeRoom(self.screen, self.ev_manager) elif room_const == config.GAME_ROOM: return rooms.GameRoom(self.screen, self.ev_manager) elif room_const == config.HIGH_SCORES_ROOM: return rooms.HighScoresRoom(self.screen, self.ev_manager) def notify(self, event): if isinstance(event, ChangeRoomRequest): if event.game_mode: config.game_mode = event.game_mode self.room = self.set_room(event.new_room) def render(self, surface): self.room.render(surface) #Run game def main(): pygame.init() screen = pygame.display.set_mode(config.screen_size) ev_manager = EventManager() spinner = CPUSpinnerController(ev_manager) room_controller = RoomController(screen, ev_manager) pygame_event_controller = PyGameEventController(ev_manager) spinner.run() # this runs the main function if this script is called to run. # If it is imported as a module, we don't run the main function. 
if __name__ == "__main__": main() event_manager.py #Remote imports import pygame from pygame.locals import * #Local imports import config from events import * def debug( msg ): print "Debug Message: " + str(msg) class EventManager: #This object is responsible for coordinating most communication #between the Model, View, and Controller. def __init__(self): from weakref import WeakKeyDictionary self.listeners = WeakKeyDictionary() self.eventQueue= [] self.gui_app = None #---------------------------------------------------------------------- def register_listener(self, listener): self.listeners[listener] = 1 #---------------------------------------------------------------------- def unregister_listener(self, listener): if listener in self.listeners: del self.listeners[listener] #---------------------------------------------------------------------- def post(self, event): if isinstance(event, MouseButtonLeftEvent): debug(event.name) #NOTE: copying the list like this before iterating over it, EVERY tick, is highly inefficient, #but currently has to be done because of how new listeners are added to the queue while it is running #(eg when popping cards from a deck). Should be changed. See: http://dr0id.homepage.bluewin.ch/pygame_tutorial08.html #and search for "Watch the iteration" for listener in list(self.listeners): #NOTE: If the weakref has died, it will be #automatically removed, so we don't have #to worry about it. listener.notify(event) #------------------------------------------------------------------------------ class PyGameEventController: """...""" def __init__(self, ev_manager): self.ev_manager = ev_manager self.ev_manager.register_listener(self) self.input_freeze = False #---------------------------------------------------------------------- def notify(self, incoming_event): if isinstance(incoming_event, UserInputFreeze): self.input_freeze = True elif isinstance(incoming_event, UserInputUnFreeze): self.input_freeze = False elif isinstance(incoming_event, TickEvent): #Share some time with other processes, so we don't hog the cpu pygame.time.wait(5) #Handle Pygame Events for event in pygame.event.get(): #If this event manager has an associated PGU GUI app, notify it of the event if self.ev_manager.gui_app: self.ev_manager.gui_app.event(event) #Standard event handling for everything else ev = None if event.type == QUIT: ev = QuitEvent() elif event.type == pygame.MOUSEBUTTONDOWN and not self.input_freeze: if event.button == 1: #Button 1 pos = pygame.mouse.get_pos() ev = MouseButtonLeftEvent(pos) elif event.type == pygame.MOUSEMOTION: pos = pygame.mouse.get_pos() ev = MouseMoveEvent(pos) #Post event to event manager if ev: self.ev_manager.post(ev) #------------------------------------------------------------------------------ class CPUSpinnerController: def __init__(self, ev_manager): self.ev_manager = ev_manager self.ev_manager.register_listener(self) self.clock = pygame.time.Clock() self.cumu_time = 0 self.keep_going = True #---------------------------------------------------------------------- def run(self): if not self.keep_going: raise Exception('dead spinner') while self.keep_going: time_passed = self.clock.tick() fps = self.clock.get_fps() self.cumu_time += time_passed self.ev_manager.post(TickEvent(time_passed, fps)) if self.cumu_time >= 1000: self.cumu_time = 0 self.ev_manager.post(SecondEvent()) pygame.quit() #---------------------------------------------------------------------- def notify(self, event): if isinstance(event, QuitEvent): #this will stop the while loop from 
running self.keep_going = False rooms.py #Remote imports import pygame #Local imports import config import continents from game_components import * from my_gui import * from pgu import high class Room(object): def __init__(self, screen, ev_manager): self.screen = screen self.ev_manager = ev_manager self.ev_manager.register_listener(self) def notify(self, event): if isinstance(event, TickEvent): pygame.display.set_caption('FPS: ' + str(int(event.fps))) self.render(self.screen) pygame.display.update() def get_highs_table(self): fname = 'high_scores.txt' highs_table = None config.all_highs = high.Highs(fname) if config.game_mode == config.TIME_CHALLENGE: if config.difficulty == config.EASY: highs_table = config.all_highs['time_challenge_easy'] if config.difficulty == config.MED_DIF: highs_table = config.all_highs['time_challenge_med'] if config.difficulty == config.HARD: highs_table = config.all_highs['time_challenge_hard'] if config.difficulty == config.SUPER: highs_table = config.all_highs['time_challenge_super'] elif config.game_mode == config.PLAN_AHEAD: pass return highs_table class TitleScreen(Room): def __init__(self, screen, ev_manager): Room.__init__(self, screen, ev_manager) self.background = pygame.image.load('assets/images/interface/background.jpg').convert() #Initialize #--------------------------------------- self.gui_form = gui.Form() self.gui_app = gui.App(config.gui_theme) self.ev_manager.gui_app = self.gui_app c = gui.Container(align=0,valign=0) #Quit Button #--------------------------------------- b = StartGameButton(ev_manager=self.ev_manager) c.add(b, 0, 0) self.gui_app.init(c) def render(self, surface): surface.blit(self.background, (0, 0)) #GUI self.gui_app.paint(surface) class GameModeRoom(Room): def __init__(self, screen, ev_manager): Room.__init__(self, screen, ev_manager) self.background = pygame.image.load('assets/images/interface/background.jpg').convert() self.create_gui() #Create pgu gui elements def create_gui(self): #Setup #--------------------------------------- self.gui_form = gui.Form() self.gui_app = gui.App(config.gui_theme) self.ev_manager.gui_app = self.gui_app c = gui.Container(align=0,valign=-1) #Mode Relaxed Button #--------------------------------------- b = GameModeRelaxedButton(ev_manager=self.ev_manager) self.b = b print b.rect c.add(b, 0, 200) #Mode Time Challenge Button #--------------------------------------- b = TimeChallengeButton(ev_manager=self.ev_manager) self.b = b print b.rect c.add(b, 0, 250) #Mode Think Ahead Button #--------------------------------------- # b = PlanAheadButton(ev_manager=self.ev_manager) # self.b = b # print b.rect # c.add(b, 0, 300) #Initialize #--------------------------------------- self.gui_app.init(c) def render(self, surface): surface.blit(self.background, (0, 0)) #GUI self.gui_app.paint(surface) class GameRoom(Room): def __init__(self, screen, ev_manager): Room.__init__(self, screen, ev_manager) #Game mode #--------------------------------------- self.new_board_timer = None self.game_mode = config.game_mode config.current_highs = self.get_highs_table() self.highs_dialog = None self.game_over = False #Images #--------------------------------------- self.background = pygame.image.load('assets/images/interface/game screen2-1.jpg').convert() self.logo = pygame.image.load('assets/images/interface/logo_small.png').convert_alpha() self.game_over_text = pygame.image.load('assets/images/interface/text_game_over.png').convert_alpha() self.trip_complete_text = 
pygame.image.load('assets/images/interface/text_trip_complete.png').convert_alpha() self.zoom_game_over = None self.zoom_trip_complete = None self.fade_out = None #Text #--------------------------------------- self.font = pygame.font.Font(config.font_sans, config.interface_font_size) #Create game components #--------------------------------------- self.continent = self.set_continent(config.continent) self.board = Board(config.board_size, self.ev_manager) self.deck = Deck(self.ev_manager, self.continent) self.map = Map(self.continent) self.longest_trip = 0 #Set pos of game components #--------------------------------------- board_pos = (SCREEN_MARGIN[0], 109) self.board.set_pos(board_pos) map_pos = (config.screen_size[0] - self.map.size[0] - SCREEN_MARGIN[0], 57); self.map.set_pos(map_pos) #Trackers #--------------------------------------- self.game_clock = Chrono(self.ev_manager) self.swap_counter = 0 self.level = 0 #Create gui #--------------------------------------- self.create_gui() #Create initial board #--------------------------------------- self.new_board = self.deck.deal_new_board(self.board) self.ev_manager.post(NewBoardComplete(self.new_board)) def set_continent(self, continent_const): #Set continent based on const if continent_const == config.EUROPE: return continents.Europe() if continent_const == config.AFRICA: return continents.Africa() else: raise Exception('Continent constant not recognized') #Create pgu gui elements def create_gui(self): #Setup #--------------------------------------- self.gui_form = gui.Form() self.gui_app = gui.App(config.gui_theme) self.ev_manager.gui_app = self.gui_app c = gui.Container(align=-1,valign=-1) #Timer Progress bar #--------------------------------------- self.timer_bar = None self.time_increase = None self.minutes_left = None self.seconds_left = None self.timer_text = None if self.game_mode == config.TIME_CHALLENGE: self.time_increase = config.time_challenge_start_time self.timer_bar = gui.ProgressBar(config.time_challenge_start_time,0,config.max_time_bank,width=306) c.add(self.timer_bar, 172, 57) #Connections Progress bar #--------------------------------------- self.connections_bar = None self.connections_bar = gui.ProgressBar(0,0,config.longest_trip_needed,width=306) c.add(self.connections_bar, 172, 83) #Quit Button #--------------------------------------- b = QuitButton(ev_manager=self.ev_manager) c.add(b, 950, 20) #Generate Board Button #--------------------------------------- b = GenerateBoardButton(ev_manager=self.ev_manager, room=self) c.add(b, 500, 20) #Board Size? #--------------------------------------- bs = SetBoardSizeContainer(config.BOARD_LARGE, ev_manager=self.ev_manager, board=self.board) c.add(bs, 640, 20) #Fill Board? #--------------------------------------- t = FillBoardCheckbox(config.fill_board, ev_manager=self.ev_manager) c.add(t, 740, 20) #Darkness? 
#--------------------------------------- t = UseDarknessCheckbox(config.use_darkness, ev_manager=self.ev_manager) c.add(t, 840, 20) #Initialize #--------------------------------------- self.gui_app.init(c) def advance_level(self): self.level += 1 print 'Advancing to next level' print 'New level: ' + str(self.level) if self.timer_bar: print 'Time increase: ' + str(self.time_increase) self.timer_bar.value += self.time_increase self.time_increase = max(config.min_advance_time, int(self.time_increase * 0.9)) self.board = self.new_board self.new_board = None self.zoom_trip_complete = None self.game_clock.unpause() def notify(self, event): #Tick event if isinstance(event, TickEvent): pygame.display.set_caption('FPS: ' + str(int(event.fps))) self.render(self.screen) pygame.display.update() #Wait to deal new board when advancing levels if self.zoom_trip_complete and self.zoom_trip_complete.finished: self.zoom_trip_complete = None self.ev_manager.post(UnfreezeCards()) self.new_board = self.deck.deal_new_board(self.board) self.ev_manager.post(NewBoardComplete(self.new_board)) #New high score? if self.zoom_game_over and self.zoom_game_over.finished and not self.highs_dialog: if config.current_highs.check(self.level) != None: self.zoom_game_over.visible = False data = 'time:' + str(self.game_clock.time) + ',swaps:' + str(self.swap_counter) self.highs_dialog = HighScoreDialog(score=self.level, data=data, ev_manager=self.ev_manager) self.highs_dialog.open() elif not self.fade_out: self.fade_out = FadeOut(self.ev_manager, config.TITLE_SCREEN) #Second event elif isinstance(event, SecondEvent): if self.timer_bar: if not self.game_clock.paused: self.timer_bar.value -= 1 if self.timer_bar.value <= 0 and not self.game_over: self.ev_manager.post(GameOver()) self.minutes_left = self.timer_bar.value / 60 self.seconds_left = self.timer_bar.value % 60 if self.seconds_left < 10: leading_zero = '0' else: leading_zero = '' self.timer_text = ''.join(['Time Left: ', str(self.minutes_left), ':', leading_zero, str(self.seconds_left)]) #Game over elif isinstance(event, GameOver): self.game_over = True self.zoom_game_over = ZoomImage(self.ev_manager, self.game_over_text) #Trip complete event elif isinstance(event, TripComplete): print 'You did it!' 
self.game_clock.pause() self.zoom_trip_complete = ZoomImage(self.ev_manager, self.trip_complete_text) self.new_board_timer = Timer(self.ev_manager, 2) self.ev_manager.post(FreezeCards()) print 'Room posted newboardcomplete' #Board Refresh Complete elif isinstance(event, BoardRefreshComplete): if event.board == self.board: print 'Longest trip needed: ' + str(config.longest_trip_needed) print 'Your longest trip: ' + str(self.board.longest_trip) if self.board.longest_trip >= config.longest_trip_needed: self.ev_manager.post(TripComplete()) elif event.board == self.new_board: self.advance_level() self.connections_bar.value = self.board.longest_trip self.connection_text = ' '.join(['Connections:', str(self.board.longest_trip), '/', str(config.longest_trip_needed)]) #CardSwapComplete elif isinstance(event, CardSwapComplete): self.swap_counter += 1 elif isinstance(event, ConfigChangeBoardSize): config.board_size = event.new_size elif isinstance(event, ConfigChangeCardSize): config.card_size = event.new_size elif isinstance(event, ConfigChangeFillBoard): config.fill_board = event.new_value elif isinstance(event, ConfigChangeDarkness): config.use_darkness = event.new_value def render(self, surface): #Background surface.blit(self.background, (0, 0)) #Map self.map.render(surface) #Board self.board.render(surface) #Logo surface.blit(self.logo, (10,10)) #Text connection_text = self.font.render(self.connection_text, True, BLACK) surface.blit(connection_text, (25, 84)) if self.timer_text: timer_text = self.font.render(self.timer_text, True, BLACK) surface.blit(timer_text, (25, 64)) #GUI self.gui_app.paint(surface) if self.zoom_trip_complete: self.zoom_trip_complete.render(surface) if self.zoom_game_over: self.zoom_game_over.render(surface) if self.fade_out: self.fade_out.render(surface) class HighScoresRoom(Room): def __init__(self, screen, ev_manager): Room.__init__(self, screen, ev_manager) self.background = pygame.image.load('assets/images/interface/background.jpg').convert() #Initialize #--------------------------------------- self.gui_app = gui.App(config.gui_theme) self.ev_manager.gui_app = self.gui_app c = gui.Container(align=0,valign=0) #High Scores Table #--------------------------------------- hst = HighScoresTable() c.add(hst, 0, 0) self.gui_app.init(c) def render(self, surface): surface.blit(self.background, (0, 0)) #GUI self.gui_app.paint(surface) game_components.py #Remote imports import pygame from pygame.locals import * import random import operator from copy import copy from math import sqrt, floor #Local imports import config from events import * from matrix import Matrix from textrect import render_textrect, TextRectException from hyphen import hyphenator from textwrap2 import TextWrapper ############################## #CONSTANTS ############################## SCREEN_MARGIN = (10, 10) #Colors BLACK = (0, 0, 0) WHITE = (255, 255, 255) RED = (255, 0, 0) YELLOW = (255, 200, 0) #Directions LEFT = -1 RIGHT = 1 UP = 2 DOWN = -2 #Cards CARD_MARGIN = (10, 10) CARD_PADDING = (2, 2) #Card types BLANK = 0 COUNTRY = 1 TRANSPORT = 2 #Transport types PLANE = 0 TRAIN = 1 CAR = 2 SHIP = 3 class Timer(object): def __init__(self, ev_manager, time_left): self.ev_manager = ev_manager self.ev_manager.register_listener(self) self.time_left = time_left self.paused = False def __repr__(self): return str(self.time_left) def pause(self): self.paused = True def unpause(self): self.paused = False def notify(self, event): #Pause Event if isinstance(event, Pause): self.pause() #Unpause Event elif isinstance(event, 
Unpause): self.unpause() #Second Event elif isinstance(event, SecondEvent): if not self.paused: self.time_left -= 1 class Chrono(object): def __init__(self, ev_manager, start_time=0): self.ev_manager = ev_manager self.ev_manager.register_listener(self) self.time = start_time self.paused = False def __repr__(self): return str(self.time_left) def pause(self): self.paused = True def unpause(self): self.paused = False def notify(self, event): #Pause Event if isinstance(event, Pause): self.pause() #Unpause Event elif isinstance(event, Unpause): self.unpause() #Second Event elif isinstance(event, SecondEvent): if not self.paused: self.time += 1 class Map(object): def __init__(self, continent): self.map_image = pygame.image.load(continent.map).convert_alpha() self.map_text = pygame.image.load(continent.map_text).convert_alpha() self.pos = (0, 0) self.set_color() self.map_image = pygame.transform.smoothscale(self.map_image, config.map_size) self.size = self.map_image.get_size() def set_pos(self, pos): self.pos = pos def set_color(self): image_pixel_array = pygame.PixelArray(self.map_image) image_pixel_array.replace(config.GRAY1, config.COLOR1) image_pixel_array.replace(config.GRAY2, config.COLOR2) image_pixel_array.replace(config.GRAY3, config.COLOR3) image_pixel_array.replace(config.GRAY4, config.COLOR4) image_pixel_array.replace(config.GRAY5, config.COLOR5)
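The dirty-rect approach the question asks about boils down to: draw the full background once, then each frame erase and redraw only the sprites that moved, and pass just those rectangles to pygame.display.update(). Below is a minimal, self-contained sketch of that pattern; the Card class, sizes, and colors are made up for illustration and are not taken from the game's source:

    import pygame

    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    background = pygame.Surface(screen.get_size()).convert()
    background.fill((30, 30, 60))

    class Card(pygame.sprite.Sprite):
        def __init__(self, pos):
            pygame.sprite.Sprite.__init__(self)
            self.image = pygame.Surface((60, 90)).convert()
            self.image.fill((200, 180, 50))
            self.rect = self.image.get_rect(topleft=pos)

    cards = pygame.sprite.RenderUpdates(Card((100, 100)), Card((300, 200)))

    screen.blit(background, (0, 0))
    pygame.display.flip()                 # full repaint happens exactly once
    clock = pygame.time.Clock()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        cards.clear(screen, background)   # restore background under each sprite
        cards.update()                    # move/animate sprites (no-op here)
        dirty = cards.draw(screen)        # list of rects that actually changed
        pygame.display.update(dirty)      # repaint only those rects
        clock.tick(60)

    pygame.quit()

pygame.sprite.RenderUpdates does the rect bookkeeping; on a mostly static card grid this usually cuts per-frame blitting dramatically compared with repainting the whole window every tick.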

    Read the article

  • Data Modeling: Logical Modeling Exercise

    - by swisscheese
    In trying to learn the art of data storage I have been trying to take in as much solid information as possible. PerformanceDBA posted some really helpful tutorials/examples in the following posts among others: is my data normalized? and Relational table naming convention. I already asked a subset question of this model here. So to make sure I understood the concepts he presented and I have seen elsewhere I wanted to take things a step or two further and see if I am grasping the concepts. Hence the purpose of this post, which hopefully others can also learn from. Everything I present is conceptual to me and for learning rather than applying it in some production system. It would be cool to get some input from PerformanceDBA also since I used his models to get started, but I appreciate all input given from anyone. As I am new to databases and especially modeling I will be the first to admit that I may not always ask the right questions, explain my thoughts clearly, or use the right verbage due to lack of expertise on the subject. So please keep that in mind and feel free to steer me in the right direction if I head off track. If there is enough interest in this I would like to take this from the logical to physical phases to show the evolution of the process and share it here on Stack. I will keep this thread for the Logical Diagram though and start new one for the additional steps. For my understanding I will be building a MySQL DB in the end to run some tests and see if what I came up with actually works. Here is the list of things that I want to capture in this conceptual model. Edit for V1.2 The purpose of this is to list Bands, their members, and the Events that they will be appearing at, as well as offer music and other merchandise for sale Members will be able to match up with friends Members can write reviews on the Bands, their music, and their events. There can only be one review per member on a given item, although they can edit their reviews and history will be maintained. BandMembers will have the chance to write a single Comment on Reviews about the Band they are associated with. Collectively as a Band only one Comment is allowed per Review. Members can then rate all Reviews and Comments but only once per given instance Members can select their favorite Bands, music, Merchandise, and Events Bands, Songs, and Events will be categorized into the type of Genre that they are and then further subcategorized into a SubGenre if necessary. It is ok for a Band or Event to fall into more then one Genre/SubGenre combination. Event date, time, and location will be posted for a given band and members can show that they will be attending the Event. An Event can be comprised of more than one Band, and multiple Events can take place at a single location on the same day Every party will be tied to at least one address and address history shall be maintained. Each party could also be tied to more then one address at a time (i.e. billing, shipping, physical) There will be stored profiles for Bands, BandMembers, and general members. So there it is, maybe a bit involved but could be a great learning tool for many hopefully as the process evolves and input is given by the community. Any input? EDIT v1.1 In response to PerformanceDBA U.3) That means no merchandise other than Band merchandise in the database. Correct ? That was my original thought but you got me thinking. Maybe the site would want to sell its own merchandise or even other merchandise from the bands. 
Not sure a mod to make for that. Would it require an entire rework of the Catalog section or just the identifying relationship that exists with the Band? Attempted a mod to sell both complete albums or song. Either way they would both be in electronic format only available for download. That is why I listed an Album as being comprised of Songs rather then 2 separate entities. U.5) I understand what you bring up about the circular relation with Favorite. I would like to get to this “It is either one Entity with some form of differentiation (FavoriteType) which identifies its treatment” but how to is not clear to me. What am I missing here? u.6) “Business Rules This is probably the only area you are weak in.” Thanks for the honest response. I will readdress these but I hope to clear up some confusion in my head first with the responses I have posted back to you. Q.1) Yes I would like to have Accepted, Rejected, and Blocked. I am not sure what you are referring to as to how this would change the logical model? Q.2) A person does not have to be a User. They can exist only as a BandMember. Is that what you are asking? Minor Issue Zero, One, or More…Oops I admit I forgot to give this attention when building the model. I am submitting this version as is and will address in a future version. I need to read up more on Constraint Checking to make sure I am understanding things. M.4) Depends if you envision OrderPurchase in the future. Can you expand as to what you mean here? EDIT V1.2 In response to PerformanceDBA input... Lessons learned. I was mixing the concept of Identifying / Non-Identifying and Cardinality (i.e. Genre / SubGenre), and doing so inconsistently to make things worse. Associative Tables are not required in Logical Diagrams as their many-to-many relationships can be depicted and then expanded in the Physical Model. I was overlooking the Cardinality in a lot of the relationships The importance of reading through relationships using effective Verb Phrases to reassure I am modeling what I want to accomplish. U.2) In the concept of this model it is only required to track a Venue as a location for an Event. No further data needs to be collected. With that being said Events will take place on a given EventDate and will be hosted at a Venue. Venues will host multiple events and possibly multiple events on a given date. In my new model my thinking was that EventDate is already tied to Event . Therefore, Venue will not need a relationship with EventDate. The 5th and 6th bullets you have listed under U.2) leave me questioning my thinking though. Am I missing something here? U.3) Is it time to move the link between Item and Band up to Item and Party instead? With the current design I don't see a possibility to sell merchandise not tied to the band as you have brought up. U.5) I left as per your input rather than making it a discrete Supertype/Subtype Relationship as I don’t see a benefit of having that type of roll up. Additional Revisions AR.1) After going through the exercise for FavoriteItem, I feel that Item to Review requires a many-to-many relationship so that is indicated. Necessary? Ok here we go for v1.3 I took a few days on this version, going back and forth with my design. Once the logical process is complete, as I want to see if I am on the right track, I will go through in depth what I had learned and the troubles I faced as a beginner going through this process. The big point for this version was it took throwing in some Keys to help see what I was missing in the past. 
Going through the process of doing a matrix also proved to be of great help. Regardless of anything, if it wasn't for the input given by PerformanceDBA I would still be a lost soul wandering in the dark. Who knows, my current design might reaffirm that I still am, but I have learned a lot, so I know I at least have a flashlight in my hand. At this point in time I admit that I am still confused about identifying and non-identifying relationships. In my model I had to use non-identifying relationships with non-nulls just to join the relationships I wanted to model. In reading a lot on the subject there seems to be a lot of disagreement and indecisiveness, so I did what I thought represented the right things in my model. When to force (identifying) and when to be free (non-identifying)? Anyone have inputs? EDIT V1.4 OK, I took the V1.3 inputs and cleaned things up for this V1.4. Currently working on a V1.5 to include attributes.
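One of the business rules above ("only one review per member on a given item, with history maintained on edits") translates directly into a constraint once the model reaches the physical phase. Purely to make that rule concrete, here is a small sketch using Python's sqlite3 as a stand-in for the planned MySQL database; all table and column names are hypothetical:

    # Illustrative only: hypothetical tables, sqlite3 standing in for MySQL.
    # The composite primary key enforces "one review per member per item";
    # edits append the old text to a history table instead of losing it.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE review (
        member_id  INTEGER NOT NULL,
        item_id    INTEGER NOT NULL,
        body       TEXT    NOT NULL,
        updated_at TEXT    NOT NULL DEFAULT (datetime('now')),
        PRIMARY KEY (member_id, item_id)   -- at most one review per member per item
    );
    CREATE TABLE review_history (
        member_id  INTEGER NOT NULL,
        item_id    INTEGER NOT NULL,
        body       TEXT    NOT NULL,
        changed_at TEXT    NOT NULL DEFAULT (datetime('now'))
    );
    """)

    def upsert_review(member_id, item_id, body):
        # Preserve the old text (if any) before replacing it.
        old = conn.execute("SELECT body FROM review WHERE member_id=? AND item_id=?",
                           (member_id, item_id)).fetchone()
        if old:
            conn.execute("INSERT INTO review_history (member_id, item_id, body) VALUES (?,?,?)",
                         (member_id, item_id, old[0]))
        conn.execute("INSERT OR REPLACE INTO review (member_id, item_id, body) VALUES (?,?,?)",
                     (member_id, item_id, body))
        conn.commit()

    upsert_review(1, 42, "Great band, great show.")
    upsert_review(1, 42, "Edited: still great.")   # same member + item: replaces, keeps history
    print(conn.execute("SELECT COUNT(*) FROM review").fetchone()[0])           # -> 1
    print(conn.execute("SELECT COUNT(*) FROM review_history").fetchone()[0])   # -> 1

The logical model itself stays implementation-free; the sketch only shows that the rule is enforceable rather than something the application has to remember to check.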

    Read the article

  • XNA Xbox 360 Content Manager Thread freezing Draw Thread

    - by Alikar
    I currently have a game that takes in large images, easily bigger than 1MB, to serve as backgrounds. I know exactly when this transition is supposed to take place, so I made a loader class to handle loading these large images in the background, but when I load the images it still freezes the main thread where the drawing takes place. Since this code runs on the 360 I move the thread to the 4th hardware thread, but that doesn't seem to help. Below is the class I am using. Any thoughts as to why my new content manager which should be in its own thread is interrupting the draw in my main thread would be appreciated. namespace FileSystem { /// <summary> /// This is used to reference how many objects reference this texture. /// Everytime someone references a texture we increase the iNumberOfReferences. /// When a class calls remove on a specific texture we check to see if anything /// else is referencing the class, if it is we don't remove it. If there isn't /// anything referencing the texture its safe to dispose of. /// </summary> class TextureContainer { public uint uiNumberOfReferences = 0; public Texture2D texture; } /// <summary> /// This class loads all the files from the Content. /// </summary> static class FileManager { static Microsoft.Xna.Framework.Content.ContentManager Content; static EventWaitHandle wh = new AutoResetEvent(false); static Dictionary<string, TextureContainer> Texture2DResourceDictionary; static List<Texture2D> TexturesToDispose; static List<String> TexturesToLoad; static int iProcessor = 4; private static object threadMutex = new object(); private static object Texture2DMutex = new object(); private static object loadingMutex = new object(); private static bool bLoadingTextures = false; /// <summary> /// Returns if we are loading textures or not. /// </summary> public static bool LoadingTexture { get { lock (loadingMutex) { return bLoadingTextures; } } } /// <summary> /// Since this is an static class. This is the constructor for the file loadeder. This is the version /// for the Xbox 360. /// </summary> /// <param name="_Content"></param> public static void Initalize(IServiceProvider serviceProvider, string rootDirectory, int _iProcessor ) { Content = new Microsoft.Xna.Framework.Content.ContentManager(serviceProvider, rootDirectory); Texture2DResourceDictionary = new Dictionary<string, TextureContainer>(); TexturesToDispose = new List<Texture2D>(); iProcessor = _iProcessor; CreateThread(); } /// <summary> /// Since this is an static class. This is the constructor for the file loadeder. /// </summary> /// <param name="_Content"></param> public static void Initalize(IServiceProvider serviceProvider, string rootDirectory) { Content = new Microsoft.Xna.Framework.Content.ContentManager(serviceProvider, rootDirectory); Texture2DResourceDictionary = new Dictionary<string, TextureContainer>(); TexturesToDispose = new List<Texture2D>(); CreateThread(); } /// <summary> /// Creates the thread incase we wanted to set up some parameters /// Outside of the constructor. /// </summary> static public void CreateThread() { Thread t = new Thread(new ThreadStart(StartThread)); t.Start(); } // This is the function that we thread. static public void StartThread() { //BBSThreadClass BBSTC = (BBSThreadClass)_oData; FileManager.Execute(); } /// <summary> /// This thread shouldn't be called by the outside world. /// It allows the File Manager to loop. /// </summary> static private void Execute() { // Make sure our thread is on the correct processor on the XBox 360. 
#if WINDOWS #else Thread.CurrentThread.SetProcessorAffinity(new int[] { iProcessor }); Thread.CurrentThread.IsBackground = true; #endif // This loop will load textures into ram for us away from the main thread. while (true) { wh.WaitOne(); // Locking down our data while we process it. lock (threadMutex) { lock (loadingMutex) { bLoadingTextures = true; } bool bContainsKey = false; for (int con = 0; con < TexturesToLoad.Count; con++) { // If we have already loaded the texture into memory reference // the one in the dictionary. lock (Texture2DMutex) { bContainsKey = Texture2DResourceDictionary.ContainsKey(TexturesToLoad[con]); } if (bContainsKey) { // Do nothing } // Otherwise load it into the dictionary and then reference the // copy in the dictionary else { TextureContainer TC = new TextureContainer(); TC.uiNumberOfReferences = 1; // We start out with 1 referece. // Loading the texture into memory. try { TC.texture = Content.Load<Texture2D>(TexturesToLoad[con]); // This is passed into the dictionary, thus there is only one copy of // the texture in memory. // There is an issue with Sprite Batch and disposing textures. // This will have to wait until its figured out. lock (Texture2DMutex) { bContainsKey = Texture2DResourceDictionary.ContainsKey(TexturesToLoad[con]); Texture2DResourceDictionary.Add(TexturesToLoad[con], TC); } // We don't have the find the reference to the container since we // already have it. } // Occasionally our texture will already by loaded by another thread while // this thread is operating. This mainly happens on the first level. catch (Exception e) { // If this happens we don't worry about it since this thread only loads // texture data and if its already there we don't need to load it. } } Thread.Sleep(100); } } lock (loadingMutex) { bLoadingTextures = false; } } } static public void LoadTextureList(List<string> _textureList) { // Ensuring that we can't creating threading problems. lock (threadMutex) { TexturesToLoad = _textureList; } wh.Set(); } /// <summary> /// This loads a 2D texture which represents a 2D grid of Texels. /// </summary> /// <param name="_textureName">The name of the picture you wish to load.</param> /// <returns>Holds the image data.</returns> public static Texture2D LoadTexture2D( string _textureName ) { TextureContainer temp; lock (Texture2DMutex) { bool bContainsKey = false; // If we have already loaded the texture into memory reference // the one in the dictionary. lock (Texture2DMutex) { bContainsKey = Texture2DResourceDictionary.ContainsKey(_textureName); if (bContainsKey) { temp = Texture2DResourceDictionary[_textureName]; temp.uiNumberOfReferences++; // Incrementing the number of references } // Otherwise load it into the dictionary and then reference the // copy in the dictionary else { TextureContainer TC = new TextureContainer(); TC.uiNumberOfReferences = 1; // We start out with 1 referece. // Loading the texture into memory. try { TC.texture = Content.Load<Texture2D>(_textureName); // This is passed into the dictionary, thus there is only one copy of // the texture in memory. } // Occasionally our texture will already by loaded by another thread while // this thread is operating. This mainly happens on the first level. catch(Exception e) { temp = Texture2DResourceDictionary[_textureName]; temp.uiNumberOfReferences++; // Incrementing the number of references } // There is an issue with Sprite Batch and disposing textures. // This will have to wait until its figured out. 
Texture2DResourceDictionary.Add(_textureName, TC); // We don't have the find the reference to the container since we // already have it. temp = TC; } } } // Return a reference to the texture return temp.texture; } /// <summary> /// Go through our dictionary and remove any references to the /// texture passed in. /// </summary> /// <param name="texture">Texture to remove from texture dictionary.</param> public static void RemoveTexture2D(Texture2D texture) { foreach (KeyValuePair<string, TextureContainer> pair in Texture2DResourceDictionary) { // Do our references match? if (pair.Value.texture == texture) { // Only one object or less holds a reference to the // texture. Logically it should be safe to remove. if (pair.Value.uiNumberOfReferences <= 1) { // Grabing referenc to texture TexturesToDispose.Add(pair.Value.texture); // We are about to release the memory of the texture, // thus we make sure no one else can call this member // in the dictionary. Texture2DResourceDictionary.Remove(pair.Key); // Once we have removed the texture we don't want to create an exception. // So we will stop looking in the list since it has changed. break; } // More than one Object has a reference to this texture. // So we will not be removing it from memory and instead // simply marking down the number of references by 1. else { pair.Value.uiNumberOfReferences--; } } } } /*public static void DisposeTextures() { int Count = TexturesToDispose.Count; // If there are any textures to dispose of. if (Count > 0) { for (int con = 0; con < TexturesToDispose.Count; con++) { // =!THIS REMOVES THE TEXTURE FROM MEMORY!= // This is not like a normal dispose. This will actually // remove the object from memory. Texture2D is inherited // from GraphicsResource which removes it self from // memory on dispose. Very nice for game efficency, // but "dangerous" in managed land. Texture2D Temp = TexturesToDispose[con]; Temp.Dispose(); } // Remove textures we've already disposed of. TexturesToDispose.Clear(); } }*/ /// <summary> /// This loads a 2D texture which represnets a font. /// </summary> /// <param name="_textureName">The name of the font you wish to load.</param> /// <returns>Holds the font data.</returns> public static SpriteFont LoadFont( string _fontName ) { SpriteFont temp = Content.Load<SpriteFont>( _fontName ); return temp; } /// <summary> /// This loads an XML document. /// </summary> /// <param name="_textureName">The name of the XML document you wish to load.</param> /// <returns>Holds the XML data.</returns> public static XmlDocument LoadXML( string _fileName ) { XmlDocument temp = Content.Load<XmlDocument>( _fileName ); return temp; } /// <summary> /// This loads a sound file. /// </summary> /// <param name="_fileName"></param> /// <returns></returns> public static SoundEffect LoadSound( string _fileName ) { SoundEffect temp = Content.Load<SoundEffect>(_fileName); return temp; } } }
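Independent of the XNA specifics, the usual shape of a background loader is: the worker thread does the slow load while holding no lock the draw thread ever takes, and only the finished object crosses over, through a queue the draw thread drains with a non-blocking check each frame. The sketch below illustrates that hand-off pattern in Python; every name is invented and nothing here comes from the ContentManager API, so it shows the locking structure only, not a drop-in fix:

    # Hand-off pattern sketch: slow loads happen with no shared lock held;
    # the render loop only does cheap, non-blocking dictionary inserts.
    import queue
    import threading
    import time

    finished = queue.Queue()      # worker -> render-thread hand-off
    textures = {}                 # only ever touched by the render thread

    def slow_load(name):
        time.sleep(0.5)           # stand-in for an expensive texture load
        return "texture:%s" % name

    def loader(names):
        for name in names:
            data = slow_load(name)        # slow part: no lock held here
            finished.put((name, data))    # publish the finished result

    threading.Thread(target=loader,
                     args=(["background_01", "background_02"],),
                     daemon=True).start()

    for frame in range(120):      # stand-in for the draw/update loop
        try:
            while True:           # drain whatever finished since last frame
                name, data = finished.get_nowait()
                textures[name] = data     # cheap insert; never waits on the loader
        except queue.Empty:
            pass
        time.sleep(1 / 60.0)      # "draw" at ~60 fps

    print(sorted(textures))

In the posted class, by contrast, the entire load loop runs inside lock (threadMutex) and the per-texture work inside lock (Texture2DMutex), so any draw-side call that touches those locks can stall for the duration of a load.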

    Read the article

  • eventmachine on debian fails install via rubygems

    - by Max
    this has been killing me for the last 5 hours. I don't seem to be able to get eventmachine running on my debian box. here this output: $ gem install thin Building native extensions. This could take a while... ERROR: Error installing thin: ERROR: Failed to build gem native extension. /home/eventhub/.rvm/rubies/ruby-1.9.3-p125/bin/ruby extconf.rb checking for rb_trap_immediate in ruby.h,rubysig.h... no checking for rb_thread_blocking_region()... yes checking for inotify_init() in sys/inotify.h... yes checking for writev() in sys/uio.h... yes checking for rb_wait_for_single_fd()... yes checking for rb_enable_interrupt()... yes checking for rb_time_new()... yes checking for sys/event.h... no checking for epoll_create() in sys/epoll.h... yes creating Makefile make compiling kb.cpp cc1plus: warning: command line option "-Wdeclaration-after-statement" is valid for C/ObjC but not for C++ cc1plus: warning: command line option "-Wimplicit-function-declaration" is valid for C/ObjC but not for C++ In file included from project.h:149, from kb.cpp:20: binder.h:35: warning: type qualifiers ignored on function return type In file included from project.h:150, from kb.cpp:20: em.h:84: warning: type qualifiers ignored on function return type em.h:85: warning: type qualifiers ignored on function return type em.h:86: warning: type qualifiers ignored on function return type em.h:88: warning: type qualifiers ignored on function return type em.h:89: warning: type qualifiers ignored on function return type em.h:90: warning: type qualifiers ignored on function return type em.h:91: warning: type qualifiers ignored on function return type em.h:93: warning: type qualifiers ignored on function return type em.h:99: warning: type qualifiers ignored on function return type em.h:116: warning: type qualifiers ignored on function return type em.h:125: warning: type qualifiers ignored on function return type In file included from project.h:154, from kb.cpp:20: eventmachine.h:46: warning: type qualifiers ignored on function return type eventmachine.h:47: warning: type qualifiers ignored on function return type eventmachine.h:48: warning: type qualifiers ignored on function return type eventmachine.h:50: warning: type qualifiers ignored on function return type eventmachine.h:65: warning: type qualifiers ignored on function return type eventmachine.h:66: warning: type qualifiers ignored on function return type eventmachine.h:67: warning: type qualifiers ignored on function return type eventmachine.h:68: warning: type qualifiers ignored on function return type In file included from project.h:154, from kb.cpp:20: eventmachine.h:103: warning: type qualifiers ignored on function return type eventmachine.h:105: warning: type qualifiers ignored on function return type eventmachine.h:108: warning: type qualifiers ignored on function return type compiling rubymain.cpp cc1plus: warning: command line option "-Wdeclaration-after-statement" is valid for C/ObjC but not for C++ cc1plus: warning: command line option "-Wimplicit-function-declaration" is valid for C/ObjC but not for C++ In file included from project.h:149, from rubymain.cpp:20: binder.h:35: warning: type qualifiers ignored on function return type In file included from project.h:150, from rubymain.cpp:20: em.h:84: warning: type qualifiers ignored on function return type em.h:85: warning: type qualifiers ignored on function return type em.h:86: warning: type qualifiers ignored on function return type em.h:88: warning: type qualifiers ignored on function return type em.h:89: 
warning: type qualifiers ignored on function return type em.h:90: warning: type qualifiers ignored on function return type em.h:91: warning: type qualifiers ignored on function return type em.h:93: warning: type qualifiers ignored on function return type em.h:99: warning: type qualifiers ignored on function return type em.h:116: warning: type qualifiers ignored on function return type em.h:125: warning: type qualifiers ignored on function return type In file included from project.h:154, from rubymain.cpp:20: eventmachine.h:46: warning: type qualifiers ignored on function return type eventmachine.h:47: warning: type qualifiers ignored on function return type eventmachine.h:48: warning: type qualifiers ignored on function return type eventmachine.h:50: warning: type qualifiers ignored on function return type eventmachine.h:65: warning: type qualifiers ignored on function return type eventmachine.h:66: warning: type qualifiers ignored on function return type eventmachine.h:67: warning: type qualifiers ignored on function return type eventmachine.h:68: warning: type qualifiers ignored on function return type In file included from project.h:154, from rubymain.cpp:20: eventmachine.h:103: warning: type qualifiers ignored on function return type eventmachine.h:105: warning: type qualifiers ignored on function return type eventmachine.h:108: warning: type qualifiers ignored on function return type compiling ssl.cpp cc1plus: warning: command line option "-Wdeclaration-after-statement" is valid for C/ObjC but not for C++ cc1plus: warning: command line option "-Wimplicit-function-declaration" is valid for C/ObjC but not for C++ In file included from project.h:149, from ssl.cpp:23: binder.h:35: warning: type qualifiers ignored on function return type In file included from project.h:150, from ssl.cpp:23: em.h:84: warning: type qualifiers ignored on function return type em.h:85: warning: type qualifiers ignored on function return type em.h:86: warning: type qualifiers ignored on function return type em.h:88: warning: type qualifiers ignored on function return type em.h:89: warning: type qualifiers ignored on function return type em.h:90: warning: type qualifiers ignored on function return type em.h:91: warning: type qualifiers ignored on function return type em.h:93: warning: type qualifiers ignored on function return type em.h:99: warning: type qualifiers ignored on function return type em.h:116: warning: type qualifiers ignored on function return type em.h:125: warning: type qualifiers ignored on function return type In file included from project.h:154, from ssl.cpp:23: eventmachine.h:46: warning: type qualifiers ignored on function return type eventmachine.h:47: warning: type qualifiers ignored on function return type eventmachine.h:48: warning: type qualifiers ignored on function return type eventmachine.h:50: warning: type qualifiers ignored on function return type eventmachine.h:65: warning: type qualifiers ignored on function return type eventmachine.h:66: warning: type qualifiers ignored on function return type eventmachine.h:67: warning: type qualifiers ignored on function return type eventmachine.h:68: warning: type qualifiers ignored on function return type In file included from project.h:154, from ssl.cpp:23: eventmachine.h:103: warning: type qualifiers ignored on function return type eventmachine.h:105: warning: type qualifiers ignored on function return type eventmachine.h:108: warning: type qualifiers ignored on function return type compiling cmain.cpp cc1plus: warning: command line option 
"-Wdeclaration-after-statement" is valid for C/ObjC but not for C++ cc1plus: warning: command line option "-Wimplicit-function-declaration" is valid for C/ObjC but not for C++ In file included from project.h:149, from cmain.cpp:20: binder.h:35: warning: type qualifiers ignored on function return type In file included from project.h:150, from cmain.cpp:20: em.h:84: warning: type qualifiers ignored on function return type em.h:85: warning: type qualifiers ignored on function return type em.h:86: warning: type qualifiers ignored on function return type em.h:88: warning: type qualifiers ignored on function return type em.h:89: warning: type qualifiers ignored on function return type em.h:90: warning: type qualifiers ignored on function return type em.h:91: warning: type qualifiers ignored on function return type em.h:93: warning: type qualifiers ignored on function return type em.h:99: warning: type qualifiers ignored on function return type em.h:116: warning: type qualifiers ignored on function return type em.h:125: warning: type qualifiers ignored on function return type In file included from project.h:154, from cmain.cpp:20: eventmachine.h:46: warning: type qualifiers ignored on function return type eventmachine.h:47: warning: type qualifiers ignored on function return type eventmachine.h:48: warning: type qualifiers ignored on function return type eventmachine.h:50: warning: type qualifiers ignored on function return type eventmachine.h:65: warning: type qualifiers ignored on function return type eventmachine.h:66: warning: type qualifiers ignored on function return type eventmachine.h:67: warning: type qualifiers ignored on function return type eventmachine.h:68: warning: type qualifiers ignored on function return type In file included from project.h:154, from cmain.cpp:20: eventmachine.h:103: warning: type qualifiers ignored on function return type eventmachine.h:105: warning: type qualifiers ignored on function return type eventmachine.h:108: warning: type qualifiers ignored on function return type cmain.cpp:96: warning: type qualifiers ignored on function return type cmain.cpp:107: warning: type qualifiers ignored on function return type cmain.cpp:117: warning: type qualifiers ignored on function return type cmain.cpp:127: warning: type qualifiers ignored on function return type cmain.cpp:269: warning: type qualifiers ignored on function return type cmain.cpp:279: warning: type qualifiers ignored on function return type cmain.cpp:289: warning: type qualifiers ignored on function return type cmain.cpp:299: warning: type qualifiers ignored on function return type cmain.cpp:309: warning: type qualifiers ignored on function return type cmain.cpp:329: warning: type qualifiers ignored on function return type cmain.cpp:678: warning: type qualifiers ignored on function return type compiling em.cpp cc1plus: warning: command line option "-Wdeclaration-after-statement" is valid for C/ObjC but not for C++ cc1plus: warning: command line option "-Wimplicit-function-declaration" is valid for C/ObjC but not for C++ In file included from project.h:149, from em.cpp:23: binder.h:35: warning: type qualifiers ignored on function return type In file included from project.h:150, from em.cpp:23: em.h:84: warning: type qualifiers ignored on function return type em.h:85: warning: type qualifiers ignored on function return type em.h:86: warning: type qualifiers ignored on function return type em.h:88: warning: type qualifiers ignored on function return type em.h:89: warning: type qualifiers ignored on function 
return type em.h:90: warning: type qualifiers ignored on function return type em.h:91: warning: type qualifiers ignored on function return type em.h:93: warning: type qualifiers ignored on function return type em.h:99: warning: type qualifiers ignored on function return type em.h:116: warning: type qualifiers ignored on function return type em.h:125: warning: type qualifiers ignored on function return type In file included from project.h:154, from em.cpp:23: eventmachine.h:46: warning: type qualifiers ignored on function return type eventmachine.h:47: warning: type qualifiers ignored on function return type eventmachine.h:48: warning: type qualifiers ignored on function return type eventmachine.h:50: warning: type qualifiers ignored on function return type eventmachine.h:65: warning: type qualifiers ignored on function return type eventmachine.h:66: warning: type qualifiers ignored on function return type eventmachine.h:67: warning: type qualifiers ignored on function return type eventmachine.h:68: warning: type qualifiers ignored on function return type In file included from project.h:154, from em.cpp:23: eventmachine.h:103: warning: type qualifiers ignored on function return type eventmachine.h:105: warning: type qualifiers ignored on function return type eventmachine.h:108: warning: type qualifiers ignored on function return type em.cpp: In member function 'bool EventMachine_t::_RunEpollOnce()': em.cpp:578: warning: 'int rb_thread_select(int, fd_set*, fd_set*, fd_set*, timeval*)' is deprecated (declared at /home/eventhub/.rvm/rubies/ruby-1.9.3-p125/include/ruby-1.9.1/ruby/intern.h:379) em.cpp:578: warning: 'int rb_thread_select(int, fd_set*, fd_set*, fd_set*, timeval*)' is deprecated (declared at /home/eventhub/.rvm/rubies/ruby-1.9.3-p125/include/ruby-1.9.1/ruby/intern.h:379) em.cpp: In member function 'bool EventMachine_t::_RunSelectOnce()': em.cpp:974: warning: 'int rb_thread_select(int, fd_set*, fd_set*, fd_set*, timeval*)' is deprecated (declared at /home/eventhub/.rvm/rubies/ruby-1.9.3-p125/include/ruby-1.9.1/ruby/intern.h:379) em.cpp:974: warning: 'int rb_thread_select(int, fd_set*, fd_set*, fd_set*, timeval*)' is deprecated (declared at /home/eventhub/.rvm/rubies/ruby-1.9.3-p125/include/ruby-1.9.1/ruby/intern.h:379) em.cpp: At global scope: em.cpp:1057: warning: type qualifiers ignored on function return type em.cpp:1079: warning: type qualifiers ignored on function return type em.cpp:1265: warning: type qualifiers ignored on function return type em.cpp:1338: warning: type qualifiers ignored on function return type em.cpp:1510: warning: type qualifiers ignored on function return type em.cpp:1593: warning: type qualifiers ignored on function return type em.cpp:1856: warning: type qualifiers ignored on function return type em.cpp:1982: warning: type qualifiers ignored on function return type em.cpp:2046: warning: type qualifiers ignored on function return type em.cpp:2070: warning: type qualifiers ignored on function return type em.cpp:2142: warning: type qualifiers ignored on function return type em.cpp:2361: fatal error: error writing to /tmp/ccdlOK0T.s: No space left on device compilation terminated. make: *** [em.o] Error 1 Gem files will remain installed in /home/eventhub/.rvm/gems/ruby-1.9.3-p125/gems/eventmachine-1.0.1 for inspection. Results logged to /home/eventhub/.rvm/gems/ruby-1.9.3-p125/gems/eventmachine-1.0.1/ext/gem_make.out Any thoughts? I read a lot of different ways to solve this issue, but none of them worked. Thanks
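Worth noting that the last compiler message before the failure is "fatal error: error writing to /tmp/ccdlOK0T.s: No space left on device", which points at a full /tmp (or the filesystem holding it) rather than at eventmachine itself. A quick way to check free space before retrying the build, as an illustrative sketch (the paths are just the usual suspects, adjust to the box):

    # Print free space on the filesystems a native gem build typically touches.
    import shutil

    for path in ("/tmp", "/home", "/"):
        try:
            usage = shutil.disk_usage(path)
        except OSError:
            continue
        print("%-6s %8.1f MiB free" % (path, usage.free / 2.0**20))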

    Read the article

  • how to export bind and keyframe bone poses from blender to use in OpenGL

    - by SaldaVonSchwartz
    EDIT: I decided to reformulate the question in much simpler terms to see if someone can give me a hand with this. Basically, I'm exporting meshes, skeletons and actions from blender into an engine of sorts that I'm working on. But I'm getting the animations wrong. I can tell the basic motion paths are being followed but there's always an axis of translation or rotation which is wrong. I think the problem is most likely not in my engine code (OpenGL-based) but rather in either my misunderstanding of some part of the theory behind skeletal animation / skinning or the way I am exporting the appropriate joint matrices from blender in my exporter script. I'll explain the theory, the engine animation system and my blender export script, hoping someone might catch the error in either or all of these. The theory: (I'm using column-major ordering since that's what I use in the engine cause it's OpenGL-based) Assume I have a mesh made up of a single vertex v, along with a transformation matrix M which takes the vertex v from the mesh's local space to world space. That is, if I was to render the mesh without a skeleton, the final position would be gl_Position = ProjectionMatrix * M * v. Now assume I have a skeleton with a single joint j in bind / rest pose. j is actually another matrix. A transform from j's local space to its parent space which I'll denote Bj. if j was part of a joint hierarchy in the skeleton, Bj would take from j space to j-1 space (that is to its parent space). However, in this example j is the only joint, so Bj takes from j space to world space, like M does for v. Now further assume I have a a set of frames, each with a second transform Cj, which works the same as Bj only that for a different, arbitrary spatial configuration of join j. Cj still takes vertices from j space to world space but j is rotated and/or translated and/or scaled. Given the above, in order to skin vertex v at keyframe n. I need to: take v from world space to joint j space modify j (while v stays fixed in j space and is thus taken along in the transformation) take v back from the modified j space to world space So the mathematical implementation of the above would be: v' = Cj * Bj^-1 * v. Actually, I have one doubt here.. I said the mesh to which v belongs has a transform M which takes from model space to world space. And I've also read in a couple textbooks that it needs to be transformed from model space to joint space. But I also said in 1 that v needs to be transformed from world to joint space. So basically I'm not sure if I need to do v' = Cj * Bj^-1 * v or v' = Cj * Bj^-1 * M * v. Right now my implementation multiples v' by M and not v. But I've tried changing this and it just screws things up in a different way cause there's something else wrong. Finally, If we wanted to skin a vertex to a joint j1 which in turn is a child of a joint j0, Bj1 would be Bj0 * Bj1 and Cj1 would be Cj0 * Cj1. But Since skinning is defined as v' = Cj * Bj^-1 * v , Bj1^-1 would be the reverse concatenation of the inverses making up the original product. That is, v' = Cj0 * Cj1 * Bj1^-1 * Bj0^-1 * v Now on to the implementation (Blender side): Assume the following mesh made up of 1 cube, whose vertices are bound to a single joint in a single-joint skeleton: Assume also there's a 60-frame, 3-keyframe animation at 60 fps. The animation essentially is: keyframe 0: the joint is in bind / rest pose (the way you see it in the image). 
keyframe 30: the joint translates up (+z in blender) some amount and at the same time rotates pi/4 rad clockwise. keyframe 59: the joint goes back to the same configuration it was in keyframe 0. My first source of confusion on the blender side is its coordinate system (as opposed to OpenGL's default) and the different matrices accessible through the python api. Right now, this is what my export script does about translating blender's coordinate system to OpenGL's standard system: # World transform: Blender -> OpenGL worldTransform = Matrix().Identity(4) worldTransform *= Matrix.Scale(-1, 4, (0,0,1)) worldTransform *= Matrix.Rotation(radians(90), 4, "X") # Mesh (local) transform matrix file.write('Mesh Transform:\n') localTransform = mesh.matrix_local.copy() localTransform = worldTransform * localTransform for col in localTransform.col: file.write('{:9f} {:9f} {:9f} {:9f}\n'.format(col[0], col[1], col[2], col[3])) file.write('\n') So if you will, my "world" matrix is basically the act of changing blenders coordinate system to the default GL one with +y up, +x right and -z into the viewing volume. Then I also premultiply (in the sense that it's done by the time we reach the engine, not in the sense of post or pre in terms of matrix multiplication order) the mesh matrix M so that I don't need to multiply it again once per draw call in the engine. About the possible matrices to extract from Blender joints (bones in Blender parlance), I'm doing the following: For joint bind poses: def DFSJointTraversal(file, skeleton, jointList): for joint in jointList: bindPoseJoint = skeleton.data.bones[joint.name] bindPoseTransform = bindPoseJoint.matrix_local.inverted() file.write('Joint ' + joint.name + ' Transform {\n') translationV = bindPoseTransform.to_translation() rotationQ = bindPoseTransform.to_3x3().to_quaternion() scaleV = bindPoseTransform.to_scale() file.write('T {:9f} {:9f} {:9f}\n'.format(translationV[0], translationV[1], translationV[2])) file.write('Q {:9f} {:9f} {:9f} {:9f}\n'.format(rotationQ[1], rotationQ[2], rotationQ[3], rotationQ[0])) file.write('S {:9f} {:9f} {:9f}\n'.format(scaleV[0], scaleV[1], scaleV[2])) DFSJointTraversal(file, skeleton, joint.children) file.write('}\n') Note that I'm actually grabbing the inverse of what I think is the bind pose transform Bj. This is so I don't need to invert it in the engine. Also note I went for matrix_local, assuming this is Bj. The other option is plain "matrix", which as far as I can tell is the same only that not homogeneous. For joint current / keyframe poses: for kfIndex in keyframes: bpy.context.scene.frame_set(kfIndex) file.write('keyframe: {:d}\n'.format(int(kfIndex))) for i in range(0, len(skeleton.data.bones)): file.write('joint: {:d}\n'.format(i)) currentPoseJoint = skeleton.pose.bones[i] currentPoseTransform = currentPoseJoint.matrix translationV = currentPoseTransform.to_translation() rotationQ = currentPoseTransform.to_3x3().to_quaternion() scaleV = currentPoseTransform.to_scale() file.write('T {:9f} {:9f} {:9f}\n'.format(translationV[0], translationV[1], translationV[2])) file.write('Q {:9f} {:9f} {:9f} {:9f}\n'.format(rotationQ[1], rotationQ[2], rotationQ[3], rotationQ[0])) file.write('S {:9f} {:9f} {:9f}\n'.format(scaleV[0], scaleV[1], scaleV[2])) file.write('\n') Note that here I go for skeleton.pose.bones instead of data.bones and that I have a choice of 3 matrices: matrix, matrix_basis and matrix_channel. 
From the descriptions in the python API docs I'm not super clear which one I should choose, though I think it's the plain matrix. Also note I do not invert the matrix in this case. The implementation (Engine / OpenGL side): My animation subsystem does the following on each update (I'm omitting parts of the update loop where it's figured out which objects need update and time is hardcoded here for simplicity): static double time = 0; time = fmod((time + elapsedTime),1.); uint16_t LERPKeyframeNumber = 60 * time; uint16_t lkeyframeNumber = 0; uint16_t lkeyframeIndex = 0; uint16_t rkeyframeNumber = 0; uint16_t rkeyframeIndex = 0; for (int i = 0; i < aClip.keyframesCount; i++) { uint16_t keyframeNumber = aClip.keyframes[i].number; if (keyframeNumber <= LERPKeyframeNumber) { lkeyframeIndex = i; lkeyframeNumber = keyframeNumber; } else { rkeyframeIndex = i; rkeyframeNumber = keyframeNumber; break; } } double lTime = lkeyframeNumber / 60.; double rTime = rkeyframeNumber / 60.; double blendFactor = (time - lTime) / (rTime - lTime); GLKMatrix4 bindPosePalette[aSkeleton.jointsCount]; GLKMatrix4 currentPosePalette[aSkeleton.jointsCount]; for (int i = 0; i < aSkeleton.jointsCount; i++) { F3DETQSType& lPose = aClip.keyframes[lkeyframeIndex].skeletonPose.joints[i]; F3DETQSType& rPose = aClip.keyframes[rkeyframeIndex].skeletonPose.joints[i]; GLKVector3 LERPTranslation = GLKVector3Lerp(lPose.t, rPose.t, blendFactor); GLKQuaternion SLERPRotation = GLKQuaternionSlerp(lPose.q, rPose.q, blendFactor); GLKVector3 LERPScaling = GLKVector3Lerp(lPose.s, rPose.s, blendFactor); GLKMatrix4 currentTransform = GLKMatrix4MakeWithQuaternion(SLERPRotation); currentTransform = GLKMatrix4TranslateWithVector3(currentTransform, LERPTranslation); currentTransform = GLKMatrix4ScaleWithVector3(currentTransform, LERPScaling); GLKMatrix4 inverseBindTransform = GLKMatrix4MakeWithQuaternion(aSkeleton.joints[i].inverseBindTransform.q); inverseBindTransform = GLKMatrix4TranslateWithVector3(inverseBindTransform, aSkeleton.joints[i].inverseBindTransform.t); inverseBindTransform = GLKMatrix4ScaleWithVector3(inverseBindTransform, aSkeleton.joints[i].inverseBindTransform.s); if (aSkeleton.joints[i].parentIndex == -1) { bindPosePalette[i] = inverseBindTransform; currentPosePalette[i] = currentTransform; } else { bindPosePalette[i] = GLKMatrix4Multiply(inverseBindTransform, bindPosePalette[aSkeleton.joints[i].parentIndex]); currentPosePalette[i] = GLKMatrix4Multiply(currentPosePalette[aSkeleton.joints[i].parentIndex], currentTransform); } aSkeleton.skinningPalette[i] = GLKMatrix4Multiply(currentPosePalette[i], bindPosePalette[i]); } Finally, this is my vertex shader: #version 100 uniform mat4 modelMatrix; uniform mat3 normalMatrix; uniform mat4 projectionMatrix; uniform mat4 skinningPalette[6]; uniform lowp float skinningEnabled; attribute vec4 position; attribute vec3 normal; attribute vec2 tCoordinates; attribute vec4 jointsWeights; attribute vec4 jointsIndices; varying highp vec2 tCoordinatesVarying; varying highp float lIntensity; void main() { tCoordinatesVarying = tCoordinates; vec4 skinnedVertexPosition = vec4(0.); for (int i = 0; i < 4; i++) { skinnedVertexPosition += jointsWeights[i] * skinningPalette[int(jointsIndices[i])] * position; } vec4 skinnedNormal = vec4(0.); for (int i = 0; i < 4; i++) { skinnedNormal += jointsWeights[i] * skinningPalette[int(jointsIndices[i])] * vec4(normal, 0.); } vec4 finalPosition = mix(position, skinnedVertexPosition, skinningEnabled); vec4 finalNormal = mix(vec4(normal, 0.), skinnedNormal, 
        vec4 finalNormal = mix(vec4(normal, 0.), skinnedNormal, skinningEnabled);

        vec3 eyeNormal = normalize(normalMatrix * finalNormal.xyz);
        vec3 lightPosition = vec3(0., 0., 2.);
        lIntensity = max(0.0, dot(eyeNormal, normalize(lightPosition)));

        gl_Position = projectionMatrix * modelMatrix * finalPosition;
    }

The result is that the animation displays wrong in terms of orientation: instead of bobbing up and down, the joint bobs in and out (along what I think is the Z axis, according to my transform in the export script). And the rotation angle is counterclockwise instead of clockwise. If I try with more than one joint, it's almost as if the second joint rotates in its own coordinate space and does not follow its parent's transform 100%, which I assume it should, since my animation subsystem is meant to follow the theory I explained for the case of more than one joint. Any thoughts?
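One thing that may be worth double-checking on the engine side is how each T/Q/S triplet is recomposed into a matrix. GLKit's GLKMatrix4TranslateWithVector3 and GLKMatrix4ScaleWithVector3 concatenate the new transform on the right of the matrix they are given, so starting from GLKMatrix4MakeWithQuaternion and then "translating" yields R * T * S rather than the conventional T * R * S local pose, and the same happens to the reconstructed inverse bind matrix. The sketch below is only an assumption about the intended composition (the helper name MakeTRS is made up), not a confirmed fix, but it isolates that one question:

    // Hypothetical helper: rebuild a local joint transform as T * R * S from a T/Q/S triplet.
    // GLKMatrix4MakeTranslation creates T; GLKMatrix4Multiply(m, R) appends R on the right;
    // GLKMatrix4ScaleWithVector3 then appends S on the right of T * R.
    static GLKMatrix4 MakeTRS(GLKVector3 t, GLKQuaternion q, GLKVector3 s)
    {
        GLKMatrix4 m = GLKMatrix4MakeTranslation(t.x, t.y, t.z);
        m = GLKMatrix4Multiply(m, GLKMatrix4MakeWithQuaternion(q));
        m = GLKMatrix4ScaleWithVector3(m, s);
        return m;
    }

    // Inside the per-joint loop this would replace the three-step construction of both
    // currentTransform and inverseBindTransform, e.g.:
    //   GLKMatrix4 currentTransform = MakeTRS(LERPTranslation, SLERPRotation, LERPScaling);
    // while leaving the palette accumulation and the final
    //   skinningPalette[i] = currentPosePalette[i] * bindPosePalette[i]
    // step as it is.

Comparing the animation with and without such a helper should at least tell whether the orientation problem comes from the matrix composition in the engine or from the matrices exported out of Blender.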

    Read the article

  • Component returned failure code: 0x80600011 [nsIXSLTProcessorObsolete.transformDocument]

    - by Sean Ochoa
    So, I'm using the XSLT plugin for JQuery, and here's my code: function AddPlotcardEventHandlers(){ // some code } function reportError(exception){ alert(exception.constructor.name + " Exception: " + ((exception.name) ? exception.name : "[unknown name]") + " - " + exception.message); } function GetPlotcards(){ $("#content").xslt("../xml/plotcards.xml","../xslt/plotcards.xsl", AddPlotcardEventHandlers,reportError); } Here's the modified jquery plugin. I say that its modified because I've added callbacks for success and error handling. /* * jquery.xslt.js * * Copyright (c) 2005-2008 Johann Burkard (<mailto:[email protected]>) * <http://eaio.com> * * Permission is hereby granted, free of charge, to any person obtaining a * copy of this software and associated documentation files (the "Software"), * to deal in the Software without restriction, including without limitation * the rights to use, copy, modify, merge, publish, distribute, sublicense, * and/or sell copies of the Software, and to permit persons to whom the * Software is furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included * in all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS * OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF * MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN * NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, * DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR * OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE * USE OR OTHER DEALINGS IN THE SOFTWARE. * */ /** * jQuery client-side XSLT plugins. * * @author <a href="mailto:[email protected]">Johann Burkard</a> * @version $Id: jquery.xslt.js,v 1.10 2008/08/29 21:34:24 Johann Exp $ */ (function($) { $.fn.xslt = function() { return this; } var str = /^\s*</; if (document.recalc) { // IE 5+ $.fn.xslt = function(xml, xslt, onSuccess, onError) { try{ var target = $(this); var change = function() { try{ var c = 'complete'; if (xm.readyState == c && xs.readyState == c) { window.setTimeout(function() { target.html(xm.transformNode(xs.XMLDocument)); if (onSuccess) onSuccess(); }, 50); } }catch(exception){ if (onError) onError(exception); } }; var xm = document.createElement('xml'); xm.onreadystatechange = change; xm[str.test(xml) ? "innerHTML" : "src"] = xml; var xs = document.createElement('xml'); xs.onreadystatechange = change; xs[str.test(xslt) ? 
"innerHTML" : "src"] = xslt; $('body').append(xm).append(xs); return this; }catch(exception){ if (onError) onError(exception); } }; } else if (window.DOMParser != undefined && window.XMLHttpRequest != undefined && window.XSLTProcessor != undefined) { // Mozilla 0.9.4+, Opera 9+ var processor = new XSLTProcessor(); var support = false; if ($.isFunction(processor.transformDocument)) { support = window.XMLSerializer != undefined; } else { support = true; } if (support) { $.fn.xslt = function(xml, xslt, onSuccess, onError) { try{ var target = $(this); var transformed = false; var xm = { readyState: 4 }; var xs = { readyState: 4 }; var change = function() { try{ if (xm.readyState == 4 && xs.readyState == 4 && !transformed) { var processor = new XSLTProcessor(); if ($.isFunction(processor.transformDocument)) { // obsolete Mozilla interface resultDoc = document.implementation.createDocument("", "", null); processor.transformDocument(xm.responseXML, xs.responseXML, resultDoc, null); target.html(new XMLSerializer().serializeToString(resultDoc)); } else { processor.importStylesheet(xs.responseXML); resultDoc = processor.transformToFragment(xm.responseXML, document); target.empty().append(resultDoc); } transformed = true; if (onSuccess) onSuccess(); } }catch(exception){ if (onError) onError(exception); } }; if (str.test(xml)) { xm.responseXML = new DOMParser().parseFromString(xml, "text/xml"); } else { xm = $.ajax({ dataType: "xml", url: xml}); xm.onreadystatechange = change; } if (str.test(xslt)) { xs.responseXML = new DOMParser().parseFromString(xslt, "text/xml"); change(); } else { xs = $.ajax({ dataType: "xml", url: xslt}); xs.onreadystatechange = change; } }catch(exception){ if (onError) onError(exception); }finally{ return this; } }; } } })(jQuery); And, here's my error msg: Object Exception: [unknown name] - Component returned failure code: 0x80600011 [nsIXSLTProcessorObsolete.transformDocument] Here's the info on the browser that I'm using for testing (with firebug v1.5.4 add-on installed): Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 Here's my XML: <?xml version="1.0" encoding="ISO-8859-1"?> <plotcardCollection sortby="order"> <plotcard order="2" id="1378"> <name><![CDATA[[placeholder for name of plotcard 1378]]]></name> <content><![CDATA[[placeholder for content of plotcard 1378]]]></content> <tagCollection> <tag id="3"><![CDATA[[placeholder for tag with id=3]]]></tag> <tag id="7"><![CDATA[[placeholder for tag with id=7]]]></tag> </tagCollection> </plotcard> <plotcard order="1" id="2156"> <name><![CDATA[[placeholder for name of plotcard 2156]]]></name> <content><![CDATA[[placeholder for content of plotcard 2156]]]></content> <tagCollection> <tag id="2"><![CDATA[[placeholder for tag with id=2]]]></tag> <tag id="9"><![CDATA[[placeholder for tag with id=9]]]></tag> </tagCollection> </plotcard> </plotcardCollection> Here's my XSLT: <?xml version="1.0" encoding="UTF-8"?> <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"> <xsl:template match="/plotcardCollection"> <xsl:variable name="sortby" select="@sortby" /> <xsl:for-each select="plotcard"> <xsl:sort select="$sortby" data-type="number" order="ascending"/> <div> <!-- Start Plotcard --> <xsl:attribute name="class">Plotcard</xsl:attribute> <xsl:for-each select="@"> <xsl:value-of select="name()"/> <xsl:text>='</xsl:text> <xsl:if test="name() = 'id'"> <xsl:text>Plotcard-</xsl:text> </xsl:if> <xsl:value-of select="." 
/> <xsl:text>'</xsl:text> </xsl:for-each> <!-- Start Plotcard Name Section --> <div> <xsl:attribute name="class"> <xsl:text disable-output-escaping="yes">PlotcardName</xsl:text> </xsl:attribute> <xsl:value-of select="name/text()"/> </div> <!-- Start Plotcard Content Section --> <div> <xsl:attribute name="class"> <xsl:text disable-output-escaping="yes">PlotcardContent</xsl:text> </xsl:attribute> <xsl:value-of select="content/text()"/> </div> </div> </xsl:for-each> </xsl:template> </xsl:stylesheet> I'm really not sure what to do about this.... any thoughts?
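For what it's worth, one detail that stands out in the stylesheet itself (independent of the plugin code) is the attribute loop: select="@" is not a valid XPath expression, since the attribute axis needs a node test such as @*, and <xsl:sort select="$sortby"/> sorts every plotcard by the same constant string rather than by the attribute named in @sortby. Whether the invalid expression is also what trips nsIXSLTProcessorObsolete.transformDocument is an assumption, but it is cheap to rule out first. A corrected fragment, keeping the $sortby variable from the original template, might look like this:

    <xsl:for-each select="plotcard">
      <!-- Sort on the attribute whose name is carried in the collection's sortby attribute -->
      <xsl:sort select="@*[name() = $sortby]" data-type="number" order="ascending"/>
      <div class="Plotcard">
        <!-- Emit each attribute as name='value', prefixing id values with "Plotcard-" -->
        <xsl:for-each select="@*">
          <xsl:value-of select="name()"/>
          <xsl:text>='</xsl:text>
          <xsl:if test="name() = 'id'">
            <xsl:text>Plotcard-</xsl:text>
          </xsl:if>
          <xsl:value-of select="."/>
          <xsl:text>'</xsl:text>
        </xsl:for-each>
        <!-- name and content sections unchanged from the original stylesheet -->
      </div>
    </xsl:for-each>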

    Read the article

  • Table overflow working in Chrome and IE but not Firefox

    - by Craig
    I am trying to get a layout that always takes up the entire screen, no more, no less. The layout has a header row, a 200px wide left bar (scrollable), and a scrollable content area. This works in Chrome and IE, but in Firefox the scroll bars never show nor work. Any thoughts? <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"> <html> <head> <style type="text/css"> * { margin: 0; padding: 0; } html, body { height: 100%; background-color: yellow; overflow: hidden; } #viewTable { width: 100%; height: 100%; background-color: red; } #header { height: 72px; background-color: blue; } #leftcol { vertical-align: top; width: 200px; height: 100%; background-color: green; } #menu { height: 100%; overflow: auto; } #rightcol { vertical-align: top; width: auto; height: 100%; background-color: purple; } #content { height: 100%; overflow: auto; } </style> </head> <body> </body> <table id="viewTable" border="0" cellpadding="0" cellspacing="0"> <tr> <td colspan="2" id="header"> Header </td> </tr> <tr> <td id="leftcol"> <div id="menu"> 0<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 100<br/> </div> </td> <td id="rightcol"> <div id="content"> 0<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 100<br/> </div> </td> </tr> </table> hi </html> I would have preferred to use CSS, but could not find any way to do it. The hi should no show, it is simply there to verify it does not. Thank you!
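Since the stated preference is a CSS solution anyway, one option worth sketching (it reuses the existing header/menu/content ids and assumes the 72px header and 200px left bar are fixed sizes) is to drop the table and absolutely position the two scroll regions against the viewport. Percentage heights and overflow inside table cells are exactly where Firefox's behaviour diverges from Chrome and IE, and absolutely positioned panes avoid that entirely:

    <style type="text/css">
      html, body { height: 100%; margin: 0; padding: 0; overflow: hidden; background-color: yellow; }
      #header  { position: absolute; top: 0; left: 0; right: 0; height: 72px; background-color: blue; }
      #menu    { position: absolute; top: 72px; bottom: 0; left: 0; width: 200px;
                 overflow: auto; background-color: green; }
      #content { position: absolute; top: 72px; bottom: 0; left: 200px; right: 0;
                 overflow: auto; background-color: purple; }
    </style>
    <body>
      <div id="header">Header</div>
      <div id="menu"><!-- left navigation; scrolls on its own --></div>
      <div id="content"><!-- main content; scrolls on its own --></div>
    </body>

Each pane then scrolls independently while the page itself never grows a scrollbar, which matches the "entire screen, no more, no less" requirement without relying on table-cell height rules.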

    Read the article

  • Steganography Experiment - Trouble hiding message bits in DCT coefficients

    - by JohnHankinson
    I have an application requiring me to be able to embed loss-less data into an image. As such I've been experimenting with steganography, specifically via modification of DCT coefficients as the method I select, apart from being loss-less must also be relatively resilient against format conversion, scaling/DSP etc. From the research I've done thus far this method seems to be the best candidate. I've seen a number of papers on the subject which all seem to neglect specific details (some neglect to mention modification of 0 coefficients, or modification of AC coefficient etc). After combining the findings and making a few modifications of my own which include: 1) Using a more quantized version of the DCT matrix to ensure we only modify coefficients that would still be present should the image be JPEG'ed further or processed (I'm using this in place of simply following a zig-zag pattern). 2) I'm modifying bit 4 instead of the LSB and then based on what the original bit value was adjusting the lower bits to minimize the difference. 3) I'm only modifying the blue channel as it should be the least visible. This process must modify the actual image and not the DCT values stored in file (like jsteg) as there is no guarantee the file will be a JPEG, it may also be opened and re-saved at a later stage in a different format. For added robustness I've included the message multiple times and use the bits that occur most often, I had considered using a QR code as the message data or simply applying the reed-solomon error correction, but for this simple application and given that the "message" in question is usually going to be between 10-32 bytes I have plenty of room to repeat it which should provide sufficient redundancy to recover the true bits. No matter what I do I don't seem to be able to recover the bits at the decode stage. I've tried including / excluding various checks (even if it degrades image quality for the time being). I've tried using fixed point vs. double arithmetic, moving the bit to encode, I suspect that the message bits are being lost during the IDCT back to image. Any thoughts or suggestions on how to get this working would be hugely appreciated. (PS I am aware that the actual DCT/IDCT could be optimized from it's naive On4 operation using row column algorithm, or an FDCT like AAN, but for now it just needs to work :) ) Reference Papers: http://www.lokminglui.com/dct.pdf http://arxiv.org/ftp/arxiv/papers/1006/1006.1186.pdf Code for the Encode/Decode process in C# below: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Drawing.Imaging; using System.Drawing; namespace ImageKey { public class Encoder { public const int HIDE_BIT_POS = 3; // use bit position 4 (1 << 3). public const int HIDE_COUNT = 16; // Number of times to repeat the message to avoid error. // JPEG Standard Quantization Matrix. // (to get higher quality multiply by (100-quality)/50 .. // for lower than 50 multiply by 50/quality. Then round to integers and clip to ensure only positive integers. public static double[] Q = {16,11,10,16,24,40,51,61, 12,12,14,19,26,58,60,55, 14,13,16,24,40,57,69,56, 14,17,22,29,51,87,80,62, 18,22,37,56,68,109,103,77, 24,35,55,64,81,104,113,92, 49,64,78,87,103,121,120,101, 72,92,95,98,112,100,103,99}; // Maximum qauality quantization matrix (if all 1's doesn't modify coefficients at all). 
public static double[] Q2 = {1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1}; public static Bitmap Encode(Bitmap b, string key) { Bitmap response = new Bitmap(b.Width, b.Height, PixelFormat.Format32bppArgb); uint imgWidth = ((uint)b.Width) & ~((uint)7); // Maximum usable X resolution (divisible by 8). uint imgHeight = ((uint)b.Height) & ~((uint)7); // Maximum usable Y resolution (divisible by 8). // Start be transferring the unmodified image portions. // As we'll be using slightly less width/height for the encoding process we'll need the edges to be populated. for (int y = 0; y < b.Height; y++) for (int x = 0; x < b.Width; x++) { if( (x >= imgWidth && x < b.Width) || (y>=imgHeight && y < b.Height)) response.SetPixel(x, y, b.GetPixel(x, y)); } // Setup the counters and byte data for the message to encode. StringBuilder sb = new StringBuilder(); for(int i=0;i<HIDE_COUNT;i++) sb.Append(key); byte[] codeBytes = System.Text.Encoding.ASCII.GetBytes(sb.ToString()); int bitofs = 0; // Current bit position we've encoded too. int totalBits = (codeBytes.Length * 8); // Total number of bits to encode. for (int y = 0; y < imgHeight; y += 8) { for (int x = 0; x < imgWidth; x += 8) { int[] redData = GetRedChannelData(b, x, y); int[] greenData = GetGreenChannelData(b, x, y); int[] blueData = GetBlueChannelData(b, x, y); int[] newRedData; int[] newGreenData; int[] newBlueData; if (bitofs < totalBits) { double[] redDCT = DCT(ref redData); double[] greenDCT = DCT(ref greenData); double[] blueDCT = DCT(ref blueData); int[] redDCTI = Quantize(ref redDCT, ref Q2); int[] greenDCTI = Quantize(ref greenDCT, ref Q2); int[] blueDCTI = Quantize(ref blueDCT, ref Q2); int[] blueDCTC = Quantize(ref blueDCT, ref Q); HideBits(ref blueDCTI, ref blueDCTC, ref bitofs, ref totalBits, ref codeBytes); double[] redDCT2 = DeQuantize(ref redDCTI, ref Q2); double[] greenDCT2 = DeQuantize(ref greenDCTI, ref Q2); double[] blueDCT2 = DeQuantize(ref blueDCTI, ref Q2); newRedData = IDCT(ref redDCT2); newGreenData = IDCT(ref greenDCT2); newBlueData = IDCT(ref blueDCT2); } else { newRedData = redData; newGreenData = greenData; newBlueData = blueData; } MapToRGBRange(ref newRedData); MapToRGBRange(ref newGreenData); MapToRGBRange(ref newBlueData); for(int dy=0;dy<8;dy++) { for(int dx=0;dx<8;dx++) { int col = (0xff<<24) + (newRedData[dx+(dy*8)]<<16) + (newGreenData[dx+(dy*8)]<<8) + (newBlueData[dx+(dy*8)]); response.SetPixel(x+dx,y+dy,Color.FromArgb(col)); } } } } if (bitofs < totalBits) throw new Exception("Failed to encode data - insufficient cover image coefficients"); return (response); } public static void HideBits(ref int[] DCTMatrix, ref int[] CMatrix, ref int bitofs, ref int totalBits, ref byte[] codeBytes) { int tempValue = 0; for (int u = 0; u < 8; u++) { for (int v = 0; v < 8; v++) { if ( (u != 0 || v != 0) && CMatrix[v+(u*8)] != 0 && DCTMatrix[v+(u*8)] != 0) { if (bitofs < totalBits) { tempValue = DCTMatrix[v + (u * 8)]; int bytePos = (bitofs) >> 3; int bitPos = (bitofs) % 8; byte mask = (byte)(1 << bitPos); byte value = (byte)((codeBytes[bytePos] & mask) >> bitPos); // 0 or 1. 
if (value == 0) { int a = DCTMatrix[v + (u * 8)] & (1 << HIDE_BIT_POS); if (a != 0) DCTMatrix[v + (u * 8)] |= (1 << HIDE_BIT_POS) - 1; DCTMatrix[v + (u * 8)] &= ~(1 << HIDE_BIT_POS); } else if (value == 1) { int a = DCTMatrix[v + (u * 8)] & (1 << HIDE_BIT_POS); if (a == 0) DCTMatrix[v + (u * 8)] &= ~((1 << HIDE_BIT_POS) - 1); DCTMatrix[v + (u * 8)] |= (1 << HIDE_BIT_POS); } if (DCTMatrix[v + (u * 8)] != 0) bitofs++; else DCTMatrix[v + (u * 8)] = tempValue; } } } } } public static void MapToRGBRange(ref int[] data) { for(int i=0;i<data.Length;i++) { data[i] += 128; if(data[i] < 0) data[i] = 0; else if(data[i] > 255) data[i] = 255; } } public static int[] GetRedChannelData(Bitmap b, int sx, int sy) { int[] data = new int[8 * 8]; for (int y = sy; y < (sy + 8); y++) { for (int x = sx; x < (sx + 8); x++) { uint col = (uint)b.GetPixel(x,y).ToArgb(); data[(x - sx) + ((y - sy) * 8)] = (int)((col >> 16) & 0xff) - 128; } } return (data); } public static int[] GetGreenChannelData(Bitmap b, int sx, int sy) { int[] data = new int[8 * 8]; for (int y = sy; y < (sy + 8); y++) { for (int x = sx; x < (sx + 8); x++) { uint col = (uint)b.GetPixel(x, y).ToArgb(); data[(x - sx) + ((y - sy) * 8)] = (int)((col >> 8) & 0xff) - 128; } } return (data); } public static int[] GetBlueChannelData(Bitmap b, int sx, int sy) { int[] data = new int[8 * 8]; for (int y = sy; y < (sy + 8); y++) { for (int x = sx; x < (sx + 8); x++) { uint col = (uint)b.GetPixel(x, y).ToArgb(); data[(x - sx) + ((y - sy) * 8)] = (int)((col >> 0) & 0xff) - 128; } } return (data); } public static int[] Quantize(ref double[] DCTMatrix, ref double[] Q) { int[] DCTMatrixOut = new int[8*8]; for (int u = 0; u < 8; u++) { for (int v = 0; v < 8; v++) { DCTMatrixOut[v + (u * 8)] = (int)Math.Round(DCTMatrix[v + (u * 8)] / Q[v + (u * 8)]); } } return(DCTMatrixOut); } public static double[] DeQuantize(ref int[] DCTMatrix, ref double[] Q) { double[] DCTMatrixOut = new double[8*8]; for (int u = 0; u < 8; u++) { for (int v = 0; v < 8; v++) { DCTMatrixOut[v + (u * 8)] = (double)DCTMatrix[v + (u * 8)] * Q[v + (u * 8)]; } } return(DCTMatrixOut); } public static double[] DCT(ref int[] data) { double[] DCTMatrix = new double[8 * 8]; for (int v = 0; v < 8; v++) { for (int u = 0; u < 8; u++) { double cu = 1; if (u == 0) cu = (1.0 / Math.Sqrt(2.0)); double cv = 1; if (v == 0) cv = (1.0 / Math.Sqrt(2.0)); double sum = 0.0; for (int y = 0; y < 8; y++) { for (int x = 0; x < 8; x++) { double s = data[x + (y * 8)]; double dctVal = Math.Cos((2 * y + 1) * v * Math.PI / 16) * Math.Cos((2 * x + 1) * u * Math.PI / 16); sum += s * dctVal; } } DCTMatrix[u + (v * 8)] = (0.25 * cu * cv * sum); } } return (DCTMatrix); } public static int[] IDCT(ref double[] DCTMatrix) { int[] Matrix = new int[8 * 8]; for (int y = 0; y < 8; y++) { for (int x = 0; x < 8; x++) { double sum = 0; for (int v = 0; v < 8; v++) { for (int u = 0; u < 8; u++) { double cu = 1; if (u == 0) cu = (1.0 / Math.Sqrt(2.0)); double cv = 1; if (v == 0) cv = (1.0 / Math.Sqrt(2.0)); double idctVal = (cu * cv) / 4.0 * Math.Cos((2 * y + 1) * v * Math.PI / 16) * Math.Cos((2 * x + 1) * u * Math.PI / 16); sum += (DCTMatrix[u + (v * 8)] * idctVal); } } Matrix[x + (y * 8)] = (int)Math.Round(sum); } } return (Matrix); } } public class Decoder { public static string Decode(Bitmap b, int expectedLength) { expectedLength *= Encoder.HIDE_COUNT; uint imgWidth = ((uint)b.Width) & ~((uint)7); // Maximum usable X resolution (divisible by 8). uint imgHeight = ((uint)b.Height) & ~((uint)7); // Maximum usable Y resolution (divisible by 8). 
// Setup the counters and byte data for the message to decode. byte[] codeBytes = new byte[expectedLength]; byte[] outBytes = new byte[expectedLength / Encoder.HIDE_COUNT]; int bitofs = 0; // Current bit position we've decoded too. int totalBits = (codeBytes.Length * 8); // Total number of bits to decode. for (int y = 0; y < imgHeight; y += 8) { for (int x = 0; x < imgWidth; x += 8) { int[] blueData = ImageKey.Encoder.GetBlueChannelData(b, x, y); double[] blueDCT = ImageKey.Encoder.DCT(ref blueData); int[] blueDCTI = ImageKey.Encoder.Quantize(ref blueDCT, ref Encoder.Q2); int[] blueDCTC = ImageKey.Encoder.Quantize(ref blueDCT, ref Encoder.Q); if (bitofs < totalBits) GetBits(ref blueDCTI, ref blueDCTC, ref bitofs, ref totalBits, ref codeBytes); } } bitofs = 0; for (int i = 0; i < (expectedLength / Encoder.HIDE_COUNT) * 8; i++) { int bytePos = (bitofs) >> 3; int bitPos = (bitofs) % 8; byte mask = (byte)(1 << bitPos); List<int> values = new List<int>(); int zeroCount = 0; int oneCount = 0; for (int j = 0; j < Encoder.HIDE_COUNT; j++) { int val = (codeBytes[bytePos + ((expectedLength / Encoder.HIDE_COUNT) * j)] & mask) >> bitPos; values.Add(val); if (val == 0) zeroCount++; else oneCount++; } if (oneCount >= zeroCount) outBytes[bytePos] |= mask; bitofs++; values.Clear(); } return (System.Text.Encoding.ASCII.GetString(outBytes)); } public static void GetBits(ref int[] DCTMatrix, ref int[] CMatrix, ref int bitofs, ref int totalBits, ref byte[] codeBytes) { for (int u = 0; u < 8; u++) { for (int v = 0; v < 8; v++) { if ((u != 0 || v != 0) && CMatrix[v + (u * 8)] != 0 && DCTMatrix[v + (u * 8)] != 0) { if (bitofs < totalBits) { int bytePos = (bitofs) >> 3; int bitPos = (bitofs) % 8; byte mask = (byte)(1 << bitPos); int value = DCTMatrix[v + (u * 8)] & (1 << Encoder.HIDE_BIT_POS); if (value != 0) codeBytes[bytePos] |= mask; bitofs++; } } } } } } } UPDATE: By switching to using a QR Code as the source message and swapping a pair of coefficients in each block instead of bit manipulation I've been able to get the message to survive the transform. However to get the message to come through without corruption I have to adjust both coefficients as well as swap them. For example swapping (3,4) and (4,3) in the DCT matrix and then respectively adding 8 and subtracting 8 as an arbitrary constant seems to work. This survives a re-JPEG'ing of 96 but any form of scaling/cropping destroys the message again. I was hoping that by operating on mid to low frequency values that the message would be preserved even under some light image manipulation.
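Given the suspicion that bits are lost somewhere between HideBits and the IDCT/clamp, it may help to isolate a single 8x8 block in a console harness with no Bitmap involved at all. The sketch below is a test scaffold, not part of the original code: the class name and test values are made up, and it assumes the Encoder/Decoder classes compile as posted. It embeds one byte, runs the block through IDCT, the +128 shift/clamp and back through the decode path, and reports the recovered byte plus how many coefficients each side actually used, so any misalignment between HideBits and GetBits shows up immediately:

    using System;
    using ImageKey;

    public class BlockRoundTripTest
    {
        public static void Main()
        {
            // Synthetic 8x8 block, already shifted to the -128..127 range that
            // GetBlueChannelData would produce. Arbitrary but non-flat values.
            int[] block = new int[64];
            for (int i = 0; i < 64; i++) block[i] = ((i * 7) % 160) - 80;

            byte[] message = { 0xA5 };               // one byte = 8 bits to embed
            int encBitOfs = 0, totalBits = 8;

            // Encode path, mirroring Encoder.Encode for the blue channel of one block.
            double[] dct = Encoder.DCT(ref block);
            int[] dctI = Encoder.Quantize(ref dct, ref Encoder.Q2);
            int[] dctC = Encoder.Quantize(ref dct, ref Encoder.Q);
            Encoder.HideBits(ref dctI, ref dctC, ref encBitOfs, ref totalBits, ref message);

            double[] dct2 = Encoder.DeQuantize(ref dctI, ref Encoder.Q2);
            int[] spatial = Encoder.IDCT(ref dct2);
            Encoder.MapToRGBRange(ref spatial);      // +128 shift and clamp to 0..255

            // Decode path, mirroring Decoder.Decode: undo the +128 shift and re-run the DCT.
            int[] received = new int[64];
            for (int i = 0; i < 64; i++) received[i] = spatial[i] - 128;

            double[] dctBack = Encoder.DCT(ref received);
            int[] dctBackI = Encoder.Quantize(ref dctBack, ref Encoder.Q2);
            int[] dctBackC = Encoder.Quantize(ref dctBack, ref Encoder.Q);

            byte[] decoded = new byte[1];
            int decBitOfs = 0, decTotalBits = 8;
            Decoder.GetBits(ref dctBackI, ref dctBackC, ref decBitOfs, ref decTotalBits, ref decoded);

            Console.WriteLine("embedded 0x{0:X2}, decoded 0x{1:X2}, bits written {2}, bits read back {3}",
                message[0], decoded[0], encBitOfs, decBitOfs);
        }
    }

If the byte already fails to round-trip here, the problem is in the quantize/IDCT/clamp chain or in the coefficient-selection rules; if it survives, the loss is happening at the image or format-conversion stage instead.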

    Read the article
